Report: Apple's AI and 'Siri' Efforts Hindered by Caution, Dysfunction
Published on April 30, 2023 at 02:04AM
The Information reports: Late last year, a trio of engineers who had just helped Apple modernize its search technology began working on the type of technology underlying ChatGPT... For Apple, there was only one problem: The engineers no longer worked there. They'd left Apple last fall because "they believed Google was a better place to work on LLMs...according to two people familiar with their thinking... They're now working on Google's efforts to reduce the cost of training and improving the accuracy of LLMs and the products based on these models, according to one of those people." MacRumors summarizes the article this way: "Siri and Apple's use of AI has been severely held back by caution and organizational dysfunction, according to over three dozen former Apple employees who spoke to The Information's Wayne Ma." The extensive paywalled report explains why former Apple employees who worked in the company's AI and machine learning groups believe that a lack of ambition and organizational dysfunction have hindered Siri and the company's AI technologies. Apple's virtual assistant is apparently "widely derided" inside the company for its lack of functionality and minimal improvement over time. By 2018, the team working on Siri had apparently "devolved into a mess, driven by petty turf battles between senior leaders and heated arguments over the direction of the assistant." Siri's leadership did not want to invest in building tools to analyze Siri's usage, and engineers lacked the ability to obtain basic details such as how many people were using the virtual assistant and how often they were doing so. The data about Siri that was gathered by the data science and engineering team was simply not being used, with some former employees calling it "a waste of time and money..."
Apple executives are said to have dismissed proposals to give Siri the ability to conduct extended back-and-forth conversations, claiming that the feature would be difficult to control and gimmicky. Apple's uncompromising stance on privacy has also created challenges for enhancing Siri, with the company pushing for more of the virtual assistant's functions to be performed on-device. Cook and other senior executives requested changes to Siri to prevent embarrassing responses, and the company prefers Siri's responses to be pre-written by a team of around 20 writers rather than AI-generated. There were also specific decisions to exclude information such as iPhone prices from Siri, pushing users directly to Apple's website instead. In 2019, Siri engineers working on the feature that uses material from the web to answer questions clashed with the design team over how accurate the responses had to be. The design team demanded a near-perfect accuracy rate before the feature could be released. Engineers claim to have spent months persuading Siri designers that not every one of its answers needed human verification, a limitation that made it impossible to scale up Siri to answer the huge number of questions asked by users. Similarly, Apple's design team repeatedly rejected the feature that enabled users to report a concern or issue with the content of a Siri answer because it wanted Siri to appear "all-knowing," preventing machine-learning engineers from understanding mistakes.
Read more of this story at Slashdot.