• Lugh@futurology.today (OP, mod) · 9 months ago

    Some countries (such as the US) are already oversupplied with law school graduates. The implication of this research is that they will soon be even more oversupplied. Law degrees are expensive to obtain. Apart from tuition costs, you need to devote years to study when you are not earning anything.

    One of the obvious questions posed by this research is why anyone should invest tens or hundreds of thousands of dollars in starting to study law in 2024. The old assumption was that the investment would pay for itself in lifetime earnings. Those old assumptions seem to be collapsing around us.

  • towerful@programming.dev · 9 months ago

    I presume there is no path from law school to senior lawyer/partner/whatever without first going through being a junior lawyer.
    So replacing the hiring and training of junior lawyers with AI to save some bucks will mean a lack of senior lawyers in a decade.

    • Troy@lemmy.ca · 9 months ago

      LLMs replacing the entry-level “clerk” work might be great from the financial perspective of existing firms, but you’re absolutely right that it’ll have long-term effects. In the short term, though, it’ll mean more law grads failing to find work, and attrition effects will kick in earlier. So we’ll see even more mortgage brokers and real estate agents and such with law degrees who have never practised.

      In any professional path there’s a junior-years attrition that occurs: a kind of professional Darwinism where some make it through the grind of the junior years and others peel off to adjacent careers. This will further frontload that attrition, because the number of junior positions available will be even smaller. But it won’t eliminate it.

      I experienced this funnel. I’m 40 now and survived it, starting my own business in my profession (geophysics). But the number of colleagues that I’ve seen bail over the years is astoundingly high. Some of them became remarkably successful by leveraging transferable skills from their initial profession, but others just end up as bartenders. The professions are vastly oversubscribed, sadly.

    • Bipta@kbin.social · 9 months ago

      Oh, sweet summer child. This is AI at the equivalent of a few-week-old baby, and it’s already better than most people at most things.

      Wait until it can improve itself.

      • Sylver@lemmy.world · 9 months ago

        AI has been “improving” itself for years now; that is nothing new or marvelous. Current LLMs are not intelligent; they are datasets and statistical analysis. AI in its current, unintelligent form has been replacing mechanically inclined jobs ever since automation began, but that is nowhere close to the fantasy-setting AI that fanboys swoon over and imagine.

        Maybe one day, but not with what we currently have!

  • Boinkage@lemmy.world · 9 months ago

    Legal research and writing is only one aspect of practicing law. How will your ChatGPT associate appear in court? Make oral arguments? Stand up to object on the record? Obtain a bar ID number? Pass the bar? Counsel clients? Console a distraught client who has just lost their child, or home, or personal liberty? Search for new business? AI can do a lot of things, but being a lawyer is much more than stringing sentences together with some Latin words thrown in.

  • givesomefucks@lemmy.world · 9 months ago

    The big firms adapt slowly.

    They might start using AI, but they’d still have a human review it; they’re not going to risk serious money to save what a junior employee makes.

    You will see a lot of lawyers thinking AI will allow them to open a business with zero employees, but all that does is open up spots at the big places when they quit.

  • JTugger@lemmy.world · 9 months ago

    Those pointing to hallucinations and such are focused on Generative AI as it is today. However, it will be vastly different in 4-6 years, when people who start law school today will be finishing. This technology is on a growth curve that is more rapid than most, if not all, technologies we have seen in history.

    A lot of the issues in AI today will be mitigated by the time the newly minted attorneys are ready to practice.

    • blargerer@kbin.social · 9 months ago

      Hallucination isn’t a solvable quirk of GPTs; it’s their function. You can’t get rid of it by throwing more money at the problem; you’d need another idea.
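
      To make that point concrete, here is a toy sketch of the generation loop being described. The vocabulary, the stand-in scoring function, and the sampled “citation” are all invented for illustration and are not taken from any real model: a GPT-style model only maps the context to a probability distribution over the next token and samples from it, so fluent-but-unverified output falls out of the same mechanism as correct output.

      ```python
      import numpy as np

      # Hypothetical toy example: a tiny vocabulary and a random scoring
      # function standing in for a trained network.
      rng = np.random.default_rng(0)
      vocab = np.array(["the", "court", "held", "in", "Smith", "v.", "Jones", "(1987)"])

      def next_token_probs(context):
          # A real model would compute logits from the context; here we just
          # draw random ones to keep the sketch self-contained.
          logits = rng.normal(size=len(vocab))
          exp = np.exp(logits - logits.max())
          return exp / exp.sum()  # softmax: a probability distribution over tokens

      context = ["The", "court", "held", "in"]
      for _ in range(4):
          probs = next_token_probs(context)
          context.append(str(rng.choice(vocab, p=probs)))  # sample; nothing here checks facts

      print(" ".join(context))  # fluent-looking text, possibly a case that never existed
      ```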

      • JTugger@lemmy.world · 9 months ago

        There are tools to manage major hallucinations, and more are coming: automated fact-checking, pattern analysis, multiple-layer analysis, etc.

        Yes, there are functional mechanisms that power hallucinations, especially in the probability models. But there are powerful tools that automate analysis of the outputs and rework them for accuracy, and those are likely to improve until they reach a level of trust sufficient for many business use cases.
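
        As a rough illustration of that kind of output check, here is a minimal sketch: the citation pattern, the trusted index, and the case names are all invented for the example. A reviewer tool can scan a draft for citations and flag anything it cannot find in a known-good database, sending it back for human rework rather than trusting it as-is.

        ```python
        import re

        # Stand-in for a real legal research database; these entries are invented.
        KNOWN_CITATIONS = {
            "Smith v. Jones, 550 U.S. 124 (2007)",
            "Doe v. Acme Corp., 910 F.2d 331 (9th Cir. 1990)",
        }

        # Very rough citation matcher, good enough for this sketch.
        CITATION_PATTERN = re.compile(r"[A-Z][\w.]+ v\. [A-Z][\w.]+,[^)]*\)")

        def flag_unverified_citations(draft: str) -> list[str]:
            """Return citations in the draft that are absent from the trusted index."""
            return [c for c in CITATION_PATTERN.findall(draft)
                    if c not in KNOWN_CITATIONS]

        draft = ("As held in Smith v. Jones, 550 U.S. 124 (2007), and in "
                 "Miller v. Baxter, 123 F.3d 456 (2d Cir. 1999), the duty applies.")

        print(flag_unverified_citations(draft))
        # ['Miller v. Baxter, 123 F.3d 456 (2d Cir. 1999)'] -> not found, send back for rework
        ```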

  • Hello_there@kbin.social · 9 months ago

    There’s been a glut of lawyers in the market since the 2008 financial crash, and many people have law degrees and have passed the bar but aren’t practicing.
    There are too many law schools, with a total annual enrollment far in excess of the available jobs.
    Don’t go to law school.

  • kwedd@feddit.nl · 9 months ago

    Is there no risk of the LLM hallucinating cases or laws that don’t exist?

    • Bipta@kbin.social · 9 months ago

      GPT-4 is dramatically less likely to hallucinate than GPT-3.5, and we’re barely starting the exponential growth curve.

      Is there a risk? Yes. Humans do it too, though, if you think about it, and all AI has to do is be better than humans, which is a milestone it already has within sight.

  • Bipta@kbin.social · 9 months ago

    It’s irrational for people to start any career that AI can do without further advances in robotics, except possibly IT.

    But in the long run, we’re all just mutated monkeys compared to AI.

  • Ben Matthews@sopuli.xyz · 9 months ago

    A similar issue applies to many types of degrees. Even without AI, there is already a massive oversupply of graduates in many subjects, especially in China. So the whole pyramid scheme of universities needs a big rethink.

    • speck@kbin.social · 9 months ago

      I’m with you on this. There’s already a systemic issue (well, issues) at play with education. LLM AI might compound it, but it won’t create it.

  • BurnedDonut@ani.social · 9 months ago

    Aside from the AI aspect, law schools all over the world are already producing more graduates than are needed. There is a surplus of lawyers in the world. So first you should do some research about your country and its conditions.

    Secondly, I don’t think there will be an AI that can take over law-related issues/jobs from humans (lawyers, prosecutors, judges, and advocates), because that AI would need to be capable of human consciousness to understand what’s going on in a case. If it achieved such capabilities, then it would be sentient and we would be discussing other things than law.

    To elaborate on why I think so: laws might seem like mathematical, rigid rules, but in reality the trial process (which includes the judge, prosecutors, lawyers, and, depending on your country, a jury) is what decides the outcome, and laws are applied to that process and outcome according to human understanding and interpretation. So, as a lawyer with over 20 years of experience, I don’t think any AI will be taking over from humans in the law sector unless it’s in a secretarial position.

    Lastly, no matter what job you choose in today’s environment, it’s up to you to distinguish yourself from the rest, because graduation alone means nothing. You should focus on networking and on developing yourself into somewhat of an expert in the area of your choosing; otherwise you’ll be competing with everyone else for the same opportunities, and AI will be the least of your worries.