AI Chip Archives - TechGoing
https://www.techgoing.com/tag/ai-chip/ (Technology News and Reviews; last updated Sun, 29 Oct 2023)

Oracle is purchasing AMD Instinct MI300X AI chip
https://www.techgoing.com/oracle-is-purchasing-amd-instinct-mi300x-ai-chip/ (Sun, 29 Oct 2023)

According to a UBS survey report, Oracle’s cloud infrastructure business is facing GPU supply constraints rather than AI demand constraints, which may limit its near-term growth.

To work around the shortage of Nvidia GPUs, Oracle says it won’t pursue a proprietary chip of its own, but will instead focus on AMD’s MI300X chips, which are planned to launch “early next year.”

According to MT Newswires, Oracle has placed an order with AMD for the Instinct MI300X. While the report did not disclose specific information such as order volume or value, it did say that Oracle is now targeting a “dual-sourcing” approach to acquire AI chips from NVIDIA and AMD.

The Instinct MI300X AI accelerator is reportedly expected to see large-scale deployment in mid-2024. In addition to Oracle, there are reports that IBM is looking to purchase AMD Xilinx FPGA AI solutions in an attempt to expand its NeuReality AI infrastructure.

AMD MI300X specifications: up to 8 XCD dies with 304 CU compute units, 8 HBM3 stacks, and 192GB of memory, 2.4 times the capacity of NVIDIA’s H100 80GB. HBM memory bandwidth reaches 5.2TB/s, and the Infinity Fabric bus bandwidth of 896GB/s also exceeds that of the H100.
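The headline memory comparison above can be sanity-checked in a couple of lines. This is a quick illustrative check using only the figures quoted in the article, not an official benchmark:

```python
# Memory figures as quoted in the article (illustrative check only)
mi300x_hbm_gb = 192   # MI300X HBM3 capacity
h100_hbm_gb = 80      # NVIDIA H100 80GB variant
mi300x_bw_tbs = 5.2   # MI300X HBM bandwidth, TB/s

ratio = mi300x_hbm_gb / h100_hbm_gb
print(f"MI300X memory capacity is {ratio:.1f}x that of the H100 80GB")  # 2.4x
```

The 2.4x figure quoted in the article checks out as 192 divided by 80.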

Microsoft to announce its own AI chip Athena next month to reduce its dependence on Nvidia
https://www.techgoing.com/microsoft-announce-its-own-ai-chip-athena-next-month-to-reduce-its-dependence-on-nvidia/ (Tue, 10 Oct 2023)

According to The Information, Microsoft is set to unveil its own AI chip next month, helping to reduce its dependence on NVIDIA GPUs.

The report says the chip is codenamed Athena, and Microsoft is expected to announce the self-developed chip at the Ignite developer conference held November 14-17. According to the public agenda for Ignite 2023, one of the main focuses of this year’s conference is integrating AI into its environment.

▲ Source: The Information

OpenAI’s ChatGPT has triggered an AI whirlwind, driving high industry demand for NVIDIA’s GPUs, which have at times been in short supply. Microsoft’s Azure AI services, as well as Bing Chat, Bing Creator, and Copilot, all run on NVIDIA’s H100 GPU series.

Microsoft is now developing its own AI chips, which should cut the cost of buying GPUs from NVIDIA and reduce its dependence on the company.

According to The Information’s report earlier this year, Microsoft actually launched the Athena project in 2019 and has since assigned some 300 people to its development. The project already has a roadmap: the first generation may be produced on TSMC’s 5-nanometer process, and some Microsoft and OpenAI employees are said to have already received Athena engineering samples.

Microsoft is not alone in seeking product autonomy: industry giants such as Google and Amazon launched their own AI chips long ago. Meanwhile, the industry’s AI fever has also benefited NVIDIA, which posted overall revenue of $11.33 billion (currently about RMB 82.822 billion) in its second quarter this year.

Microsoft to launch its first AI chip next month to reduce reliance on Nvidia
https://www.techgoing.com/microsoft-to-launch-its-first-ai-chip-next-month-to-reduce-reliance-on-nvidia/ (Sat, 07 Oct 2023)

According to media reports on October 7, citing people familiar with the matter, Microsoft plans to launch the company’s first chip designed to support artificial intelligence (AI) at its annual developer conference next month. The move is the culmination of years of work and could help Microsoft reduce its reliance on Nvidia’s AI chips, which have been in short supply as demand surges.

Microsoft’s chips are similar to Nvidia’s graphics processing units (GPUs) and are designed for data center servers that train and run large language models, the software behind conversational AI features like OpenAI’s ChatGPT. Microsoft’s data center servers currently use Nvidia’s GPUs to power cloud customers including OpenAI and Intuit, as well as artificial intelligence capabilities in Microsoft’s productivity apps.

The chip, codenamed “Athena,” may be unveiled at Microsoft’s Ignite conference in Seattle on November 14. Athena is expected to compete with Nvidia’s flagship microprocessor H100 GPU to accelerate artificial intelligence in data centers. The custom chip has been secretly tested by teams at Microsoft and its partner OpenAI.

Microsoft began developing the Athena chip around 2019, seeking to cut costs while also hoping to increase its leverage in negotiations with Nvidia. Azure currently relies on Nvidia’s GPUs to power AI capabilities used by Microsoft, OpenAI and cloud customers. But with Athena, Microsoft can follow in the footsteps of rivals AWS and Google in offering its own AI chips to cloud users.

Athena’s performance details are unclear, but Microsoft hopes the chip will rival Nvidia’s H100. While many companies tout superior hardware and cost-efficiency, Nvidia GPUs remain the top choice for AI developers thanks to the company’s CUDA platform. Attracting users to new hardware and software will be key for Microsoft.

Microsoft’s in-house development of AI chips may also reduce its reliance on Nvidia amid tight GPU supplies. After beginning to work closely with OpenAI, Microsoft reportedly ordered at least hundreds of thousands of Nvidia chips to support OpenAI’s product and research needs. Using its own chips could bring significant cost savings.

OpenAI may also be considering reducing its reliance on Microsoft and Nvidia chips. There have been reports recently that the artificial intelligence research laboratory is considering manufacturing its own artificial intelligence chips. Recent job postings on the OpenAI website also indicate that the company intends to hire talent to evaluate and co-design AI hardware.

While Microsoft and other cloud providers have no immediate plans to stop buying GPUs from Nvidia, it could be financially beneficial in the long run to convince their cloud customers to move more to in-house chips rather than Nvidia’s GPU servers. Microsoft is also working closely with AMD on its upcoming artificial intelligence chip, the MI300X. As AI workloads proliferate, this diverse approach offers a variety of options. Cloud computing rivals are employing similar strategies to avoid vendor lock-in.

Amazon and Google have strategically integrated their AI chips into promotions for their cloud businesses. Amazon provided financial support to OpenAI rival Anthropic on the condition that Anthropic would use Amazon’s artificial intelligence chips, called Trainium and Inferentia. Meanwhile, Google Cloud announced that customers including artificial intelligence image developer Midjourney and Character AI are using the company’s tensor processing units.

As artificial intelligence chips become an essential part of data centers, the rewards for betting on the space could be high. With this development, Microsoft will also join the ranks of competitors fighting for market share in the accelerating field of artificial intelligence chips. With Athena, Microsoft can offer cloud customers more choices while charting a more independent course on next-generation AI infrastructure.

Due to supply chain capacity upgrades, TSMC’s AI chips to become more expensive
https://www.techgoing.com/due-to-supply-chain-capacity-upgrades-tsmcs-ai-chips-to-become-more-expensive/ (Tue, 26 Sep 2023)

According to Taiwan’s “United Daily News”, as TSMC’s supply chain expands CoWoS advanced packaging capacity, rising prices for the silicon interposers used in the process will eventually push up the cost of the AI chips the company produces.

TSMC is investing billions of dollars to upgrade its packaging capabilities due to strong demand for AI products. The company announced in July this year that it would invest US$2.89 billion to build a new chip packaging plant. The company aims to increase packaging production capacity to 30,000 pieces per month by the end of 2024.

Note: CoWoS (Chip-on-Wafer-on-Substrate) is a packaging technology that stacks multiple chip dies together, placing them on a silicon interposer to improve performance.

TSMC is reportedly purchasing CoWoS machines from equipment makers such as Xinyun, Wanrun, Hongsu, Titanium, and Qunyi, which are expected to be the biggest beneficiaries of the growing demand for TSMC’s CoWoS output. Delivery and installation are expected to be completed in the first half of next year.

According to industry sources, TSMC’s current CoWoS advanced packaging capacity is about 12,000 wafers per month. Under the earlier expansion, monthly capacity will gradually grow to 15,000 to 20,000 wafers; once the additional equipment is installed, monthly capacity could exceed 25,000 and even approach 30,000 wafers, increasing TSMC’s ability to take on AI-related orders. Because of these capacity upgrades, TSMC’s AI chips will also see price increases.

In addition, Taiwanese media pointed out that NVIDIA is currently the largest customer of TSMC’s CoWoS advanced packaging, with order volume accounting for 60% of production capacity. Recently, NVIDIA has expanded its orders in response to strong demand for AI computing, and urgent orders from customers such as Amazon and Broadcom have also begun to emerge.

Given customers’ urgent demand for CoWoS advanced packaging capacity, TSMC has placed a follow-on order with equipment makers for 30% more tools, requiring delivery and installation to be completed by the end of the second quarter of next year, with mass production to begin in the second half of next year.
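A quick arithmetic recap of the capacity figures cited in this report (an illustrative calculation using only the article’s numbers):

```python
# CoWoS monthly capacity milestones cited in the article (wafers per month)
current = 12_000                  # capacity today
after_first_expansion = 20_000    # upper end of the first expansion target
final_target = 30_000             # capacity once the extra equipment lands

print(f"Final target is {final_target / current:.1f}x current capacity")  # 2.5x
print(f"Follow-on equipment order: +30% more tools")
```

So the full expansion, if realized, would leave TSMC with two and a half times today’s CoWoS output.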

TSMC urgently buys more CoWoS packaging equipment as Nvidia chases more AI chips
https://www.techgoing.com/tsmc-urgently-buys-more-cowos-packaging-equipment-as-nvidia-chases-another-ai-chip/ (Mon, 25 Sep 2023)

With demand for Nvidia’s AI chips running hot, foundry TSMC has been raising production capacity all along.

According to Taiwan media “Economic Daily News”, TSMC’s CoWoS (Chip-on-Wafer-on-Substrate) advanced packaging capacity is full and the company is actively expanding it. With major customer NVIDIA rumored to be expanding its AI chip orders, and rush orders from AMD, Amazon, and other large players also emerging, TSMC has hurried to its equipment suppliers to purchase additional CoWoS tools, raising equipment order volumes by another 30% on top of its existing expansion target. This underscores how hot the AI market remains.

TSMC has reportedly sought the assistance of equipment makers including Xinwei, Wanrun, Hongsu, Titanium, and Qunyi to expand and reinforce its CoWoS tooling, with delivery and installation expected in the first half of next year. These equipment makers are busy: having already taken machine orders for TSMC’s original expansion target, they have now received a further 30% in follow-on orders. Their revenues should grow significantly in the second half of the year, and their order backlogs now give visibility into the first half of next year.

Industry sources revealed that TSMC’s current CoWoS advanced packaging capacity is about 12,000 wafers per month. The expansion launched earlier will gradually lift that to 15,000 to 20,000 wafers per month, and once the additional equipment is installed, monthly capacity could exceed 25,000 and even approach 30,000 wafers, greatly increasing TSMC’s capacity to take on AI-related orders.

None of the equipment makers reportedly involved would comment on the order dynamics. Sources close to the matter revealed that as AI computing applications develop substantially, including machine self-learning, training of large language models (LLMs), and AI inference, and land in fields such as self-driving cars and smart factories, demand for AI chips will maintain strong growth.

The report also said that large customers such as NVIDIA and AMD increased their foundry chip volumes in the third quarter, effectively boosting utilization of TSMC’s 7nm and 5nm advanced-process capacity. CoWoS advanced packaging capacity, however, remains in short supply and has become the biggest bottleneck in the production chain.

TSMC President Wei Zhejia (C.C. Wei) mentioned at a recent press conference that TSMC has been actively expanding its CoWoS advanced packaging capacity in the hope of easing the capacity crunch after the second half of 2024. TSMC is understood to have squeezed out plant space to add CoWoS capacity at its Zhuke, Zhongke, Nanke, and Longtan sites, and its Zhunan packaging and testing plant will also build advanced packaging lines such as CoWoS and TSMC SoIC in parallel.

Industry sources pointed out that TSMC kicked off its CoWoS expansion plan in the second quarter, placing the first batch of equipment orders in May. That equipment is expected to be in place and installed by the end of the first quarter of next year, after which CoWoS monthly capacity can rise to 15,000 to 20,000 wafers. Even though TSMC has been vigorously expanding CoWoS capacity, the explosion in customer demand has led it to place additional orders with its equipment partners.

Equipment industry sources pointed out that NVIDIA is currently TSMC’s largest customer for CoWoS advanced packaging, with orders accounting for 60% of capacity. Recently, on strong demand for AI computing, NVIDIA has expanded its orders, and urgent orders from customers such as AMD, Amazon, and Broadcom have also begun to emerge. Given customers’ urgent demand for CoWoS capacity, TSMC has placed a follow-on order with equipment makers for 30% more tools, requesting delivery and installation by the end of the second quarter of next year and mass production in the second half of next year.

TSMC’s AI chip supply crunch is expected to improve by the end of 2024
https://www.techgoing.com/tsmcs-ai-chip-supply-crunch-is-expected-to-improve-by-the-end-of-2024/ (Fri, 08 Sep 2023)

According to Nikkei Asia, TSMC said on Wednesday that, with many companies pushing ahead on AI models such as ChatGPT, the tight supply of AI chips will take about 18 months to ease.

At the recent SEMICON industry event, TSMC Chairman Liu Deyin (Mark Liu) said that the current supply constraints for AI chips are only temporary and are expected to ease significantly by the end of 2024. He said TSMC’s ability to raise AI chip output is limited by its testing and packaging capacity and, at this stage, by constraints on the spatial layout of complex chips.

Liu explained that demand for CoWoS chip packaging suddenly tripled this year: “We cannot meet 100% of customer demand right now, but we hope to fulfill about 80% of the orders.”

At its recent quarterly presentation, TSMC management committed to doubling its core capacity by the end of 2024. TSMC will invest $2.9 billion (currently about RMB 21.257 billion) to build a new chip testing and packaging facility.

Mark Liu believes that the semiconductor industry must respond to the “paradigm shift”. In order to further increase the number of transistors in a chip, manufacturers must increasingly use complex spatial arrangements.

TSMC management said that flagship AI accelerators can now incorporate up to 100 billion transistors, a number that will increase tenfold to more than a trillion over the next decade. Such advances can be realized by combining multiple dies in a single package.
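A tenfold increase over a decade implies a steady compound growth rate, which can be computed directly from the figures quoted above (illustrative arithmetic only):

```python
# Transistor-count projection quoted above
start_transistors = 100e9   # ~100 billion in today's flagship AI accelerators
end_transistors = 1e12      # more than a trillion within a decade
years = 10

annual_growth = (end_transistors / start_transistors) ** (1 / years) - 1
print(f"Implied compound growth: {annual_growth:.1%} per year")  # ~25.9%
```

In other words, the projection corresponds to transistor counts compounding at roughly 26% per year, which is where multi-die packaging comes in.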

Gartner: Global AI chip revenue will reach $53 billion in 2023 and $119.4 billion in 2027
https://www.techgoing.com/gartner-global-ai-chip-revenue-will-reach-53-billion-in-2023-and-119-4-billion-in-2027/ (Sat, 26 Aug 2023)

According to market research firm Gartner’s latest forecast, global AI chip revenue is expected to grow 20.9% year-on-year in 2023, reaching $53.4 billion (currently about RMB 388.752 billion).

According to Gartner, the development of generative AI and the widespread use of various AI-based applications in data centers, edge infrastructures, and endpoint devices will require the deployment of GPUs and “optimized semiconductor devices,” which will drive the production and deployment of AI chips.

▲ Source Gartner

Gartner expects AI semiconductor revenue to keep growing at a double-digit rate over the forecast period, rising 25.6% to $67.1 billion in 2024 (currently about RMB 488.488 billion). By 2027, AI chip revenue is expected to more than double the 2023 market size, reaching $119.4 billion (currently about RMB 869.232 billion).
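The forecast figures above imply a compound annual growth rate that can be checked in a few lines (an illustrative calculation using only the numbers Gartner quotes):

```python
# Gartner forecast figures quoted above (USD billions)
rev_2023 = 53.4
rev_2024 = 67.1
rev_2027 = 119.4

# Implied compound annual growth rate, 2023 -> 2027 (four years)
cagr = (rev_2027 / rev_2023) ** (1 / 4) - 1
print(f"Implied CAGR 2023-2027: {cagr:.1%}")        # ~22.3%
print(f"2027 vs 2023: {rev_2027 / rev_2023:.2f}x")  # more than double
```

The "more than double" claim holds: $119.4 billion is about 2.24 times the 2023 figure, an implied growth rate of roughly 22% per year.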


As enterprise use of AI workloads matures, many more industries and IT organizations will deploy systems that include AI chips, Gartner said. In the consumer electronics market, Gartner estimates that the value of AI-enabled application processors for devices will reach $1.2 billion by the end of 2023 (currently about RMB 8.736 billion), up from $558 million in 2022.

Gartner believes that the need for efficient and optimized designs that support the cost-effective execution of AI workloads will lead to an increase in the deployment of custom-designed AI chips. For many organizations, the mass deployment of custom AI chips will replace the current dominant chip architecture, the Discrete GPU, for a variety of AI workloads based on generative AI technologies.

UK plans to spend $130 million on GPUs as a national AI research resource
https://www.techgoing.com/uk-plans-to-spend-130-million-on-gpus-as-a-national-ai-research-resource/ (Mon, 21 Aug 2023)

In order to catch up in the global computing power race, British Prime Minister Rishi Sunak plans to spend 100 million pounds to buy thousands of high-performance artificial intelligence chips.

British government officials are understood to have been in discussions with IT giants such as Nvidia, AMD and Intel to procure chips for a so-called “national AI research resource” as part of Rishi Sunak’s ambition to make the UK a global leader in AI.

The effort, led by science funding body UK Research and Innovation, is believed to be in advanced negotiations with Nvidia, from which the UK would buy 5,000 GPUs, the chips that power artificial intelligence models such as ChatGPT.

According to people familiar with the matter, £100 million has already been disbursed. However, the spending is seen as insufficient to meet the government’s AI ambitions, and officials are urging Chancellor of the Exchequer Jeremy Hunt to allocate more funding in the coming months.

GPUs are a key component in building AI systems like ChatGPT, whose latest version was trained using as many as 25,000 Nvidia chips.

Rishi Sunak outlined plans for the UK to become an AI superpower, but the UK lags badly behind the US and Europe in terms of the computing resources needed to train, test and operate complex models.

A government review published this year criticized the UK for lacking “dedicated AI computing resources”, with fewer than 1,000 high-end Nvidia chips at the disposal of researchers. The report recommends at least 3,000 “top-spec” GPUs be available as soon as possible.

In March, Hunt agreed to set aside 900 million pounds ($1.15 billion) to buy computing resources, although most of the money is expected to go to research and development of traditional supercomputers.

Just over 50 million pounds ($63.7 million) of that is believed to be allocated to AI resources, but the figure is expected to rise to between 70 million and 100 million pounds (approximately $89 million to $130 million) as the world scrambles for AI chips.
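The pound-to-dollar conversions in this piece are consistent with a single implied exchange rate, which can be checked from the article’s own figures (illustrative only; the rate below is derived, not quoted):

```python
# Derived from the article: 50 million GBP stated as 63.7 million USD
gbp_50m_in_usd = 63.7
implied_rate = gbp_50m_in_usd / 50    # ~1.274 USD per GBP

print(f"Implied rate: {implied_rate:.3f} USD/GBP")
print(f"GBP 70m  -> USD {70 * implied_rate:.0f}m")    # ~89, matching the $89m quoted
print(f"GBP 100m -> USD {100 * implied_rate:.0f}m")   # ~127; the article rounds to $130m
```

So the headline "$130 million" is simply the £100 million upper bound at roughly $1.27 per pound, rounded up.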

British officials are likely to press the government to reveal more funding in the autumn budget statement, which is likely to be released around the time of the AI safety summit in November.

Last week, media reports said Saudi Arabia had purchased at least 3,000 Nvidia H100 processors, the company’s high-end components used to train artificial intelligence, for $40,000 each. Tech giants such as Microsoft, Amazon and Google are also racing to buy tens of thousands of AI chips.

It is unclear what type of chips the UK is negotiating to buy. The GPUs will be used to build an AI research resource that the government hopes will be operational next summer.

In addition, British officials are weighing the merits of “sovereign chatbots”, a publicly funded language model similar to ChatGPT, and looking for ways to facilitate the deployment of artificial intelligence in public services such as the UK’s National Health Service (NHS).

Although Nvidia’s chips are widely used to train artificial intelligence and are seen as the clear frontrunner, the British government has also been in related discussions with various microchip companies.

Rishi Sunak is promoting the UK as a center for setting global standards for the safe development of artificial intelligence. He has spearheaded plans for the AI Safety Summit, which is expected to be held at Bletchley Park, the center of World War II code-breaking. It is hoped the event will produce an international agreement between governments and top AI companies on developing the technology.

A government spokesman said: “We are committed to supporting the UK’s thriving computing environment to maintain our global leadership in science, innovation and technology.”

Nvidia declined to comment.

LG releases UP Home Appliances 2.0 with AI chips
https://www.techgoing.com/lg-releases-up-home-appliances-2-0-with-ai-chips/ (Thu, 27 Jul 2023)

According to BusinessKorea, LG Electronics recently launched the smart home solution “UP Home Appliances 2.0”, which allows home appliances including washing machines and dryers to install and uninstall apps like smartphones. LG has previously focused on making premium hardware products, and the new plan expands its business to offer services and subscriptions for those products.


▲ Picture source LG official website

Ryu Jae-cheol, head of LG Electronics’ Home Appliances Division, said: “Customers tend to use home appliances of different types and functions according to their lifestyle. We will provide devices that are precisely tailored to each individual’s lifestyle.” UP appliances, which debuted last year, are able to update software and add new functions, just like updating a smartphone’s operating system.

The UP Home Appliance 2.0 released by LG aims to achieve “ultra-personalization”. Just like people can install and delete necessary applications on their smartphones, UP Home Appliance 2.0 will provide similar support for devices.

Officials say that to achieve this goal, LG will apply AI chips and an operating system in its home appliances, something the company has spent more than three years developing internally. The new features will first appear in the washing machines and dryers released yesterday.

LG has also introduced a device subscription system that combines hardware products with non-hardware services, and consumers can choose a three- to six-year subscription period. “From a consumer’s point of view, it reduces the initial product purchase cost, and it can be cheaper when hardware and software are ordered together,” an LG Electronics official explained.

Samsung partners with ‘semiconductor legend’ Jim Keller to accelerate AI chip development
https://www.techgoing.com/samsung-partners-with-semiconductor-legend-jim-keller-to-accelerate-ai-chip-development/ (Thu, 20 Jul 2023)

IT Home reported on July 20 that, according to Business Korea, Samsung Electronics’ semiconductor foundry division has launched chip research projects with Tenstorrent and Groq, with the design service team inside the foundry division responsible for the work.

In 2021, semiconductor industry legend Jim Keller, known as the “Silicon Immortal,” officially joined Tenstorrent, a then little-known AI chip startup, as Chief Technology Officer and President; he became CEO in January 2023 and joined the company’s board of directors.

In 2023, Keller and Sam Zeloof (who built his own 1,200-transistor CPU in his garage and is known as the “Silicon Prodigy”) founded Atomic Semi, a startup seeking to build a small, fast semiconductor fab.

▲ Image source: Tenstorrent

Samsung Electronics will work with Tenstorrent and Groq to develop AI semiconductor chips for use in advanced IT equipment, sources said.

If the project leads to mass production, its chips are expected to be manufactured on Samsung’s sub-5nm EUV line and packaged in a 2.5D packaging facility.

Industry experts speculate that if Samsung successfully completes the project with the two companies, it will have a significant impact on the foundry market.

As the size of the AI market driven by ChatGPT continues to grow, Samsung’s foundry, an early adopter of the partnership, could reap significant profits if the two startups’ positions are strengthened.

In addition, both companies have a lot of potential in the global AI industry, and Jim Keller, needless to say, is a legend in the semiconductor space, having led the design of state-of-the-art semiconductors at Apple, Tesla, Intel and AMD.

IT Home has learnt that Groq is a semiconductor company founded in 2016 by former Google employee Jonathan Ross, and the company’s recent partnership with Meta has fuelled speculation that Groq could pose a threat to Nvidia.
