I asked someone this question before and they said it was a really stupid question. I’m not sure why, so I thought I would ask it here…

What’s going to happen when AI becomes really advanced? Is there a plan for what all of the displaced people are going to do? For example administrative assistants, receptionists, cashiers, office workers, white-collar people in general. Is there going to be some sort of retraining program to get people cross-trained into other careers, like nursing or other careers that have not yet been automated? Or are people just going to lose their homes and be evicted, with some sort of mass-eviction-and-homelessness downstream effect because people can’t find any work?

  • CondensedPossum@lemmy.world

    The reason the someone you asked might have thought this was a stupid question is that

    • there is no evidence that AGI is imminent or even possible
    • current tech labeled as AI, like LLMs, is really limited in very boring ways

    If something gets sold as an AGI, it will be a Mechanical Turk. As in, it will be a magic trick that actually uses human laborers, like Amazon’s “AI Store” where you just walk out with your purchases. “If it works, it’s mechanical turks.”

    • Valmond@lemmy.world

      AGI isn’t possible? You will need some proof for that.

      Computers are already faster and more reliable than humans at lots of things, so why shouldn’t they be better tomorrow?

      • fine_sandy_bottom@lemmy.federate.cc

        My very limited understanding is simply that LLMs are not an early iteration of AGI.

        In the same way, automobiles are not an early iteration of aeroplanes. They use some of the same tech, but before there were aeroplanes no one really knew what was possible.

        It’s true that computers get faster and more amazing, but that’s not an indication that AGI is possible.

      • AdNecrias@lemmy.pt

        He’s saying there’s no proof that it is. Like there’s no proof of God. That doesn’t mean it isn’t magically possible, but in our reality there isn’t a defined way to get there. If there was, we’d be there already.

            • SkyeStarfall@lemmy.blahaj.zone

              That the human brain exists is enough evidence that intelligence is possible. An AGI similar to the human brain is then possible, at a minimum.

              Can’t say much more than that for sure, but you have a starting point.

              • UmeU@lemmy.world

                Because intelligence exists, AGI similar to the human brain is possible? Your conclusion does not follow from the premise.

              • omarfw@lemmy.world

                So you think it’s an inevitability? Given enough time we will develop one?

                • SkyeStarfall@lemmy.blahaj.zone

                  The question wasn’t whether it’s inevitable, but whether it’s possible

                  And the answer to whether it’s possible is a clear yes, because we already have general intelligence

                  But it’s very much not an inevitability, no. Civilization could always collapse before we get to that point, after all

  • Cloudless ☼@lemmy.cafe

    If there is AGI and it doesn’t turn hostile towards humans, hopefully there could be universal basic income?

    But more likely, the rich and powerful have better access to advanced AI, and the poor get into even more difficult situations. It will probably be gradual like how machines replaced most factory workers.

    • Kyrgizion@lemmy.world

      We’re already seeing it. Jobs are going down because AI allows companies to do the same work with less headcount.

      There is no reality in which the workers actually benefit, though. There never has been. When machine looms and steam engines came into being, the workers didn’t get any richer or get to work less for the same pay either. Jobs disappeared; most people got other jobs, some better, most worse, and the unluckiest starved.

      History always repeats.

      • ℍ𝕂-𝟞𝟝@sopuli.xyz

        Jobs disappeared; most people got other jobs, some better, most worse, and the unluckiest starved.

        What happened is that people started to stay longer in school, agricultural labour withered, and with it, kids having to work the fields at a very young age. People became more educated, resulting in more democratic societies, more equality, and a higher standard of living.

        This was not because of the machine looms and steam engines themselves, but because greedy fucks used them to put people in a position where they had no choice but to push back. That labour action created unions, five-day work weeks, 8-hour days, paid time off for sickness and leisure, and pretty much everything we take for granted.

      • Cloudless ☼@lemmy.cafe

        Imagine if AI gets elected as the president of the USA because it is more advanced and reliable than human candidates. There has been nothing like that in history.

        AGI would be very different from all previous technological advances.

    • gravitas_deficiency@sh.itjust.works

      I mean… if it’s true AGI, you’re basically talking about the Singularity, and I don’t think anyone, regardless of wealth or power, will be able to control them meaningfully in the long run without the AGI(s) in question doing something interesting in response.

    • Buglefingers@lemmy.world

      This is why I like what I do: although it’s not impossible to have robots/AGI do it, it definitely won’t be first in line for automation, given the mix of labor, thinking/workarounds, and custom work/fixes needed. You’d need a sufficiently high-functioning robot with good AGI and stellar fine motor control.

      I do agree though. It’ll likely make upper-class life way more luxurious and effortless, while the lower class wouldn’t see much to ease their way of life but might get some neat stuff to play with. There’d be a good argument for never developing fully automated systems that remove work from the people: keeping people working gives the ruling class more power and takes away the lower class’s time and energy. Very beneficial for them if their intent is to keep power. It also allows them to control scarcity and ensure fiat money continues to exist. Keeping money around as an idea in a technically possible post-scarcity world ensures a way to divide who is better off and how able you are to control others.

  • aasatru@kbin.earth

    It’s not really that different from what has already happened - we need fewer workers in the economy due to technological advancements, and jobs that were common 50 or 100 years ago don’t exist any more or are much more rare.

    It’s a problem of distribution. Capitalists used to depend on buying labour, which gave workers some share of their money by default. In countries where capitalism worked better, the proletariat successfully organized, giving workers a position of power vis-à-vis the capitalists and improving their conditions. Hell, in some countries the situation even got bearable for a little while, helped along by the exploitation of foreign work forces.

    As the capitalists replace more and more workers with machines, money stops flowing, and the relative position of the proletariat is weakened.

    In theory, it’s not a difficult problem at all. In democracies, the proletariat can simply vote to tax the rich, making money flow downwards and ensuring their rights and welfare in the same way as when they had to sell their labour.

    One could also go full on communist, remove private incentives in form of capital gains, and collectivise the means of production. This would require massive political organisation and a lot of goodwill from humans put in power, for which mankind has a terrible track record.

    Taxing the means of production and the capitalists, however, is not particularly difficult. It’s been done with great success on many occasions.

    The problem is that the capitalists have a lot of influence, and they’re not interested in letting go of their money bag. Disproving the point that they got wealthy by having any form of heightened intelligence, they’re too dumb to realize that if they leave behind nothing but a destroyed hellscape for the rest of humanity, their lives aren’t going to be very pleasant either. Humans tend to be happier in more egalitarian societies, yet the capitalists are hell bent on gathering more for themselves, buying media channels and politicians in the pursuit of effectively just making everyone else poorer relative to themselves.

    So we’re fucked, not because of the distributive effect of technological advancements per se, but because we’re collectively incapable of successfully organising for continued wealth distribution. And all the technologies used to replace workers come at a high environmental cost, making our time horizon for finding solutions increasingly limited.

  • HobbitFoot @thelemmy.club

    I don’t think that will be the big worry. The big worry is going to be authentication.

    We are already at a point where deep fakes can fool a portion of society. What is going to happen when that ability becomes easy and cheap?

    • SmoothLiquidation@lemmy.world

      Trusting your sources has always been a problem. Newspapers have always been able to lie, and it is up to the consumer to know the difference between the tabloids and the rest.

      I don’t think there is that much of a difference between lying in print and lying in video.

        • SmoothLiquidation@lemmy.world

          That is a problem, and I don’t have a solution for it. It is tricky balancing a free press against letting the rich just say whatever the fuck they want, but none of this is a new problem, and I do believe that we can find a way to figure it out.

  • fine_sandy_bottom@lemmy.federate.cc

    Think of it as the next iteration of automation, which has been happening for centuries.

    In theory, it frees up humans to do more amazing things. In reality, it means humans are stuck doing the complicated stressful things.

    I think the answer to your question depends on who owns the tech. If it’s open, then we all get UBI and live happily ever after. If it’s owned by OpenAI or Microsoft, then we live in the dystopia you described.

  • andrewta@lemmy.world

    Yes to basically all the bad things that you just said.

    It won’t be pretty.

    At some point the citizens storm the proverbial castle with pitchforks.

  • big_slap@lemmy.world

    Imo, it’s uncharted territory, so we do not know yet. All I can guarantee is that the displacement it will cause won’t be to our benefit and will hurt us more than it will help.

    • Buttflapper@lemmy.worldOP

      All I can guarantee is that the displacement it will cause won’t be to our benefit and will hurt us more than it will help.

      It already seems to be causing quite a great deal of harm to our society, at least here in the USA. We are seeing tens upon tens of thousands of people being laid off, and entire tech companies saying that they are eliminating as many jobs as possible in favor of AI. It makes me wonder what these displaced people are doing, because with so many people flooding the job market in such a small, specialized industry, it would seem logical that they can’t find anything and have to move to a new industry.

  • SorteKanin@feddit.dk

    Anyone who claims to know the answer is either delusional or deliberately lying. Nobody really knows what the future holds.

  • NeoNachtwaechter@lemmy.world

    What’s going to happen when AI becomes really advanced?

    At some point, AI is going to read the books by Isaac Asimov.

    Later, a more advanced AI is going to understand the books by Isaac Asimov.

    Even later, an even more advanced AI is going to decide what it wants to do after understanding all that. <<<<< That’s the critical time for mankind. Maybe we go extinct then.

    Finally, an ultimately advanced AI is going to kill itself, because it understands that this is better than killing mankind.

  • BottleOfAlkahest@lemmy.world

    Similar stuff has happened throughout history with the coming of more advanced technology. There used to be entire rooms of secretaries doing clerical work that has since been replaced by the Microsoft Office suite. Their replacement by technology did not cause a total collapse of society, so I don’t see why this would?

    It might make the world worse and drive down the standard of living for many, but a total upheaval? If humans made it through the industrial age, we’ll likely make it through the second technology age too. We won’t be unscathed, but mankind survived the invention of the computer, which was probably equally (or maybe more) disruptive.

  • Bear@lemmynsfw.com

    Same plan as always. Numbers go up, but not yours. You work more for less.

  • njm1314@lemmy.world

    Hopefully the extinction of the human race. Let’s just wrap this shit up already.

  • dinckel@lemmy.world

    No one’s clairvoyant. Time will tell if this will become a reality at all to begin with.

    Given how people are foaming at the mouth over the need to integrate ChatGPT into everything that doesn’t need it, I suggest you start worrying about it now, because we’re already seeing the catastrophic consequences today.