This is a cache of https://developers.slashdot.org/story/24/10/29/2223255/more-than-a-quarter-of-new-code-at-google-is-generated-by-ai. It is a snapshot of the page at 2024-10-30T01:12:53.318+0000.
More Than a Quarter of New Code At Google Is Generated By AI - Slashdot

Programming AI Google

More Than a Quarter of New Code At Google Is Generated By AI 15

Google has integrated AI deeply across its operations, with over 25% of its new code generated by AI. CEO Sundar Pichai announced the milestone during the company's third quarter 2024 earnings call. The Verge reports: AI is helping Google make money as well. Alphabet reported $88.3 billion in revenue for the quarter, with Google Services (which includes Search) revenue of $76.5 billion, up 13 percent year-over-year, and Google Cloud (which includes its AI infrastructure products for other companies) revenue of $11.4 billion, up 35 percent year-over-year. Operating incomes were also strong. Google Services hit $30.9 billion, up from $23.9 billion last year, and Google Cloud hit $1.95 billion, significantly up from last year's $270 million. "In Search, our new AI features are expanding what people can search for and how they search for it," CEO Sundar Pichai says in a statement. "In Cloud, our AI solutions are helping drive deeper product adoption with existing customers, attract new customers and win larger deals. And YouTube's total ads and subscription revenues surpassed $50 billion over the past four quarters for the first time."

Comments Filter:
  • by silentbozo ( 542534 ) on Tuesday October 29, 2024 @07:34PM (#64904809) Journal

    Does this mean 25% of the codebase at Google is completely autonomous, from inception, design, implementation, testing, deployment, and maintenance?

    Or does this mean Google is using their own version of Github Copilot as an autocomplete resource alongside normal engineering activities, and they're attributing 25% of the newly implemented codebase by lines, to "AI" generated activities?

    Given that Google is trying to push "AI" as a sellable feature, I'd want to know the actual breakdown of how it is being dogfooded, and the ROI. Instead of talking about how much new code is "AI" generated, how many engineer hours are they saving, and are they getting an equivalent or better level of deliverables (features, tests, tooling), with an equivalent or better delivery timeline?

    https://www.businessinsider.co... [businessinsider.com]

    "Pichai said using AI for coding was "boosting productivity and efficiency" within Google. After the code is generated, it is then checked and reviewed by employees, he added.

    "This helps our engineers do more and move faster," said Pichai. "I'm energized by our progress and the opportunities ahead, and we continue to be laser focused on building great products.""

    This basically sounds like Github Copilot. You're pairing with an LLM, but you still need humans in the loop to judge and tweak the output, do code reviews, etc. Here the benefit is that Google has large internal repositories to train against. Will external customers be able to benefit from this, or are these proprietary Google-only models?

    • by r1348 ( 2567295 )

      Don't you just love when the LLM you're forced to work with is trying to use a string as an array index? Actual LLM coding experience.

      • by ls671 ( 1122017 )

        It works in PHP; maybe that's where it got the idea from.
        https://www.php.net/manual/en/... [php.net]
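        To illustrate the difference (a Python sketch, since Python draws the line that PHP blurs): a numeric string is rejected outright as a list index, and in a dict the integer 1 and the string "1" remain distinct keys, whereas PHP coerces the numeric string to an integer array key.

```python
# Python rejects a numeric string as a list index outright...
items = ["a", "b", "c"]
try:
    items["1"]
except TypeError as err:
    print("list:", err)  # list indices must be integers or slices, not str

# ...and in a dict, the integer 1 and the string "1" are
# distinct keys, unlike a PHP array, where "1" is silently
# coerced to int 1 and both land in the same slot.
table = {1: "int key", "1": "string key"}
print(table[1], "/", table["1"])
```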

      • Maybe switch to a cool language that lets you do things like that? :-D (Also, is this yet another tale from GPT-3.5 you keep repeating? Have you tried e.g. Qwen 7b-coder, etc.?)

        More seriously, code generation isn't the best use of LLMs in programming IME. They kick ass at code reviews, if you feel like being humbled. They are also great at helping to spitball stuff and getting pointers to authoritative sources of information.
    • Does this mean 25% of the codebase at Google is completely autonomous, from inception, design, implementation, testing, deployment, and maintenance?

      You could even take it to mean Terraform, Kubernetes manifest, Helm chart, etc. "code", or anything else from that whole ecosystem of plumbing-as-code job-creation software.

      • Comments technically qualify as "code" for the purposes of managerial presentations. Which my "upline" didn't bother to mention during their quarterly E level meeting, when they said our code is 40% AI generated. (We use an AI system to help comment code as part of an automated documentation repository.)
    • Does this mean 25% of the codebase at Google is completely autonomous, from inception, design, implementation, testing, deployment, and maintenance?

      In the investor announcement [blog.google], it's a somewhat disjointed statement at the end of a long line of more concrete statements. You can imagine that it was put there by some ambitious ladder-climbing manager, who did some "research" motivated by the chance to get a phrase into the earnings announcement. Having gotten a phrase into the earnings announcement, his profile is now raised compared to his fellow comrades (or so he thinks, while those around him roll their eyes). But it also might actually work in helping him get promoted.

    • by gweihir ( 88907 )

      Does this mean 25% of the codebase at Google is completely autonomous, from inception, design, implementation, testing, deployment, and maintenance?

      Sounds like it. Or maybe it is only experimental code, i.e. the number is very misleading? Basically a lie?

  • by GameboyRMH ( 1153867 ) <<moc.liamg> <ta> <hmryobemag>> on Tuesday October 29, 2024 @07:50PM (#64904845) Journal

    The best-case scenario here is that Google counted every single character that came out when a coder hit the autocomplete key to enter the next couple words of their code (probably variable and function names mostly, like many code-oriented text editors have done for decades), and they're selling it to investors to make it look like they're 1/4 way to kicking all their coders to the curb and becoming a fully-automated post-human-labor company.
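    A completion-acceptance metric like the one described above is trivial to define, which is part of why it makes for an easy headline number. Here is a hypothetical sketch (not Google's actual methodology or tooling): count the characters inserted by accepted AI completions as a share of all characters added.

```python
def ai_generated_fraction(events):
    """Hypothetical metric: characters inserted via accepted AI
    completions, as a share of all characters added.

    events is a list of (source, text) pairs, where source is
    either "ai_completion" or "typed".
    """
    ai_chars = sum(len(text) for src, text in events if src == "ai_completion")
    total_chars = sum(len(text) for _, text in events)
    return ai_chars / total_chars if total_chars else 0.0

# A single imagined editing session: the engineer types a stub,
# then accepts one tab-completion for the body.
session = [
    ("typed", "def handle_request(req):\n    "),
    ("ai_completion", "return Response(status=200)"),
]
print(ai_generated_fraction(session))
```

    By this accounting, one accepted autocomplete can dominate the figure, which is exactly the inflation the comment above is worried about.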

    Wouldn't be the first time something like that happened. The health care megacorp I used to work for once put out a press release saying they were using AI for processes which I can assure you were 100% AI-free at the time (like the rest of the entire software suite they were part of), and I would bet still are. I pointed it out in our company chat, joked about whether we should be fitting our servers with GPUs or NPUs, and got lots of laughs.

    • Any company with a data science team (if they also needed to impress people) suddenly turned that team into "AI" when talking about it publicly.

      Why not? Even A* is AI.
  • by Anonymous Coward

    Those who made fun of the writers' strike: your day is coming. AI doesn't have to be perfect, just good enough. Don't think the suits aren't paying attention. Your replacement is already planned.
