
Microsoft is too ruthless! Low-memory PCs don't get Windows 11 AI, starting at 16GB?


There is no doubt that AI technology has become the next inflection point for personal computing devices such as smartphones and computers. Upstream, chip brands such as Qualcomm and Intel have completed their product layouts for NPUs and AI chips and are moving toward putting AI into practice. On the product side, since the 2023 Snapdragon Summit at the end of last year, major brands other than Apple have rushed to launch products built on large-language-model technology. At CES 2024, which ended not long ago, well-known computer brands such as Samsung, Asus, Lenovo, and Dell all released their own "AI computer" products, and at its "Unpacked" event Samsung released its first batch of "AI phones", the Galaxy S24 series.

Even Microsoft, which is often slow to maintain compatibility with older hardware, has decided to add more AI features to future versions of Windows 11/12. At the Ignite 2023 developer conference, Microsoft CEO Satya Nadella even said that Microsoft has become a "Copilot company."


Image source: Microsoft

But in terms of hardware configuration, some computers don't seem to be ready for the AI era.

According to market research firm TrendForce:

Microsoft has set the baseline of memory in AI PCs to 16 GB. In the long term, TrendForce expects AI-powered PCs to drive an increase in annual demand for PC memory, which is further fueled by consumer upgrade trends.

In other words, if your computer has less than 16GB of RAM, Microsoft's Copilot AI is likely to be off the table.

16GB isn't that high?

For digital enthusiasts like you and me, a 16GB RAM baseline doesn't seem like much: the December 2023 Steam Hardware Survey shows that fewer than 22% of surveyed gamers have less than 16GB of RAM. But as just noted, that number only represents Steam players; in the broader market, this 22% figure is not representative. Besides, Microsoft's Copilot AI is aimed more at work scenarios, so it is hardly scientific to use gamers' high-end gaming PCs as the reference.


Image source: Steam

Considering that most companies run a 3-year replacement cycle for business PCs, we might as well look at the configurations of 3 years ago, that is, 2020. According to Digital Trends, the average consumer laptop in 2020 typically shipped with only 8GB of RAM. For example, in one city data center's 2020 procurement announcement, the purchased computers had only 8GB of memory, which does not meet the new "baseline".


Image source: Chongqing Municipal Government Procurement Network

Of course, considering Windows 11's actual adoption, Microsoft and Copilot cannot afford to be too picky: according to data published by Statcounter, as of November 2023, Windows 11 accounted for only 26.66% of Windows installations. Windows 10 held 68%, and all the other Windows versions combined amounted to less than half of Windows 10's share.

Recall that not long ago Microsoft was trying every means to "drive" users onto Windows 11, even "opening the floodgates" by letting users activate Windows 11 through unofficial means. Given that, Xiaolei doesn't believe Microsoft will put memory obstacles in front of future Windows 11 users. More likely, it will relax the memory limit at the last moment, or allow everyone to upgrade while declining to guarantee how well the key AI applications will run.

But then again, AI large language models already run well on smartphones; Qualcomm even demonstrated a highly efficient on-device large language model at the 2023 Snapdragon Summit. Is it really true that an 8GB computer can't run AI? Is there a way to make a large-language-model application run on a device with only 8GB of memory?

There are ways to do this, such as hybrid AI large language models that mix a local model with an online one.
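To see why 8GB isn't an automatic dead end, some back-of-envelope arithmetic helps. A model's weight footprint is roughly its parameter count times the bytes per weight, so quantization (storing weights in 8 or 4 bits instead of 16) shrinks the memory needed dramatically. The sketch below is illustrative only: real runtimes add overhead for the KV cache, activations, and the operating system itself.

```python
def model_memory_gb(n_params_billions: float, bits_per_weight: int) -> float:
    """Approximate weight-only memory in GB for a model of the given size."""
    bytes_per_weight = bits_per_weight / 8
    return n_params_billions * 1e9 * bytes_per_weight / 1024**3

# A 7-billion-parameter model at common precisions.
# At 4 bits the weights shrink to roughly 3.3 GB, small enough
# to leave headroom on an 8GB machine (ignoring runtime overhead).
for bits in (16, 8, 4):
    print(f"7B model @ {bits}-bit: {model_memory_gb(7, bits):.1f} GB")
```

By this rough measure, a 16-bit 7B model (about 13GB of weights) is hopeless on an 8GB machine, while an aggressively quantized version of the same model fits, which is exactly the trade-off on-device LLM runtimes exploit.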

Is the "distributed" hybrid model the future?

In terms of where they run, current AI large-language-model applications fall roughly into three categories: local, remote, and hybrid. Local operation is easy to understand: the entire model runs on the device the user is interacting with. Applications that work fully offline, as well as models that enthusiasts deploy on their own computers, fall into this category.

Remote operation, on the other hand, refers to situations where the input/output device and the device running the large language model are not the same system. AI applications accessed through a web page or a WeChat Mini Program, with the entire model running on a remote server, count as remote.


Image source: Microsoft

Local and remote operation each have their advantages and disadvantages. Local operation works without a network connection, keeps user data on the device (reducing the risk of data leakage), and its response speed does not depend on network conditions, which can significantly improve the user experience. Remote operation requires sending data to a server and waiting for the output to come back, but because the remote server is generally an HPC cluster whose computing power is orders of magnitude beyond the user's local resources, generative applications respond faster and compute more accurately.

The hybrid operation model, on the other hand, takes the best of both worlds and provides the flexibility to distribute computing tasks between on-premises and the cloud based on the specific needs of the application. For example, some tasks that require high response speed or require immediate processing can be performed locally, while those that require more computing power for analysis and learning can be performed in the cloud. This flexibility allows AI systems to better adapt to different use cases and provide more customized services.
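The dispatch logic described above can be sketched in a few lines. This is a minimal illustration of the hybrid idea, not any real Copilot API: the task fields, thresholds, and backend names are all assumptions made up for the example.

```python
from dataclasses import dataclass

@dataclass
class Task:
    prompt: str
    needs_privacy: bool = False   # sensitive data must stay on-device
    max_latency_ms: int = 5000    # how long the caller can wait
    est_complexity: int = 1       # 1 = simple lookup, 10 = heavy reasoning

def route(task: Task) -> str:
    """Decide which backend should handle the task."""
    if task.needs_privacy:
        return "local"            # private data never leaves the device
    if task.max_latency_ms < 300:
        return "local"            # a cloud round trip would be too slow
    if task.est_complexity >= 7:
        return "cloud"            # needs cluster-scale compute
    return "local"                # default: cheap tasks stay on-device

print(route(Task("summarize my health log", needs_privacy=True)))  # local
print(route(Task("draft a market research report", est_complexity=9)))  # cloud
```

The point of the sketch is that the routing decision is cheap and can run on even the weakest client, while the expensive inference happens wherever it fits best.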

First, the hybrid model has significant advantages in terms of data privacy and security. With the frequent occurrence of data breaches, consumers are increasingly concerned about the security of their personal information. In a hybrid model, sensitive data can be processed locally without having to be transferred to the cloud, which greatly reduces the risk of data breaches. For the average consumer, this means they can use AI-based applications such as smart home devices and personal health data analysis tools with greater peace of mind without worrying too much about their privacy being violated.

Second, the hybrid model provides better performance and responsiveness. For applications that require real-time reactions, such as voice assistants and online gaming, local computing provides faster processing speeds and avoids the latency associated with data transfer to the cloud. For those tasks that require complex computing and large data analysis, such as image recognition and large language models, they can be processed by leveraging the powerful computing power of the cloud to provide efficient AI services. For consumers, this combination of on-premise responsiveness and cloud-based computing can provide a smoother and more efficient experience in different scenarios.


Image source: Microsoft

Flexibility is another important benefit of the hybrid model. It allows consumers to choose how their data is processed according to their needs and preferences. At home, for example, consumers may be more inclined to use local computing to process personal data to ensure privacy, while they may rely more on cloud services for information and entertainment when on the go. This flexibility allows the hybrid model to adapt to the diverse needs of different consumers.

In addition, the hybrid model is friendlier to device compatibility and update maintenance. Since the cloud can always serve the latest AI models and algorithms, consumers can enjoy up-to-date AI technology without frequently replacing their local devices. At the same time, older devices can still handle some local computing tasks efficiently, which reduces the pressure to replace hardware and extends device lifespan, an important advantage for cost-conscious consumers.

Even Microsoft itself has repeatedly emphasized hybrid AI large language models. Even on machines held to the 8GB memory "floor", I believe Microsoft is fully capable of providing basic AI capabilities and making large language models an "inclusive" feature.

However, given the rapid development of memory technology in recent years, the 16GB baseline may well become standard equipment for office computers; after all, this is about AI and "productivity". Facing the wave of AI PCs, 16GB of RAM is likely to become the "starting configuration" of the notebook market in 2024.


Image courtesy of Apple

Of course, the M3 MacBook Pro, which costs up to 12,999 yuan yet still ships with only 8GB of RAM, is presumably not included in this "AI inclusive project".
