Is being concerned about your privacy really a misconception here?
There's nothing inherently good or bad about everyone getting access to or becoming capable of something.
If folks don't understand that, then yes, I'd say it's a pretty big misconception, and it needs to be better marketed as a key feature.
And if the AI PC is just a regular PC with a cloud bot integrated, then ... what even is the point? You can already do the remote chatbot thing with a regular PC, privacy nightmares included!
Even Apple Intelligence is getting a lot of negative feedback from the review crowd due to limitations like being unable to summarize a very large document (which is pretty much the point of such a feature).
The problem is that AI has few well-defined use cases and a mountain of expectations, and this really shows in these companies' execution. It's hard to build good products when the requirements are "we don't really know."
To trick people into buying new hardware, lest they get left behind in the AI race
Unrealistic for now because running ML locally is slow, but even if it weren't, even laypeople know by now that you can break an LLM into doing what it isn't supposed to. Since this LLM needs unlimited access to your personal data to be useful, if I get to it I don't even need to bypass any secure enclaves or whatnot, because it will tell me whatever I ask for in plain <insert your language>.
All eggs in one basket
It is normal for most tech to have an exploratory period, where its potential is clear, but its immediate economic impact is negative during iterations of product-market fit adaptation.
Normally, the producer of new tech eats most or all of the risk and cost of the search for product-market fit.
But some tech is so compelling that customers feel the strategic need to participate in the discovery loop too.
Obviously, there are upfront costs and risks in deploying/trying tech that is still hit-and-miss. But during a sea change, there is also risk in not experimenting and adopting/adapting early.
> AI Ready for Tomorrow
> Ready for the ever-evolving possibilities of AI? This Swift Go AI PC integrates Intel’s new dedicated AI engine—Intel® AI Boost—with Acer’s own AI solutions, for more intuitive and enjoyable AI experiences.
It really seems like some kind of stupid joke.
you mean in a weird generative AI way?
I.e. the car has four tires and a steering wheel.
https://ludic.mataroa.blog/blog/i-will-fucking-piledrive-you...
https://chatgpt.com/share/6743194c-ae00-8001-af3c-0915007579...
Just working out the age of my personal desktop computer involves a Ship of Theseus problem, but it's safe to say 20+ years. However, it now has an RTX 3060 graphics card with 12 GB of VRAM, plus NVMe SSDs, and it can run inference on 7B-parameter, 4-bit-quantised transformer LLMs (sketched below) and generate images with large diffusion models. I've also used it for many applications that would have counted as AI before the latest generative AI hype cycle.
So is it an AI PC? At what point did it become an AI PC? Or is a self-built machine in which you swap parts inherently never an AI PC?
Given how amorphously the term is defined, I consider it purely marketing fluff.
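For the curious, here's a minimal sketch of the kind of local inference described above, using llama-cpp-python; the model file path and settings are placeholders (any 7B 4-bit-quantised GGUF fits comfortably in 12 GB of VRAM), not a claim about a specific setup:

```python
# Minimal local-inference sketch (pip install llama-cpp-python).
# A 7B model quantised to 4 bits needs roughly 4-5 GB of VRAM,
# well within an RTX 3060's 12 GB.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/some-7b-instruct.Q4_K_M.gguf",  # hypothetical local file
    n_gpu_layers=-1,  # offload all layers to the GPU
    n_ctx=4096,       # context window size
)

out = llm("Q: Is a 20-year-old, part-swapped desktop an AI PC?\nA:", max_tokens=128)
print(out["choices"][0]["text"])
```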
There's not a specific line in the sand, although tasking it with machine learning (in which outcomes improve based on collecting runtime inputs, rather than based only on its creator adding capabilities) would be a decent one. That's fairly human-like, while non-ML workloads are more plant-like.
1) A box on the screen where I can chat with one to do ideation or really anything I want.
2) A command-driven approach where I hit a hotkey, type a prompt, and the response is dumped out in front of me, possibly with some text selected beforehand that heavily influences the response.
These are both pretty cool, tbh, and developers will have a field day for years finding sensible ways to incorporate them into programs (a rough sketch of the second pattern follows below).
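To make the second pattern concrete, here's a small sketch of the handler a hotkey would trigger; the endpoint URL and model name are assumptions, standing in for any OpenAI-compatible server, local or cloud:

```python
# Sketch of pattern 2: send the typed prompt (plus any selected text)
# to an OpenAI-compatible chat endpoint and return the response.
import requests

def run_prompt(prompt: str, selected_text: str | None = None) -> str:
    # Selected text, if any, is prepended as context that shapes the response.
    content = f"{selected_text}\n\n{prompt}" if selected_text else prompt
    resp = requests.post(
        "http://localhost:8080/v1/chat/completions",  # hypothetical local server
        json={
            "model": "local-model",  # placeholder model name
            "messages": [{"role": "user", "content": content}],
        },
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

# A real tool would bind this to a global hotkey and grab the current
# selection from the OS; here we just call it directly.
print(run_prompt("Summarise this:", selected_text="The AI PC is ..."))
```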
None of this has anything to do with driving the hardware upgrade cycle since most of the models are running in the cloud. But driving hardware upgrades is what these marketing people are really trying to do when they talk about AI PC. They are irrelevant people, but they need to convert everything they see into a reason to buy a new PC.
That's what they get paid for. Monkey marketer see trend, monkey marketer do marketing. Monkey steal your attention.
Maybe an LLM will replace THEM soon. After all, it's basically a digital version of the million monkeys on typewriters...
Hah, no, they very much aren't, as Intel is an insignificant player at the moment. As long as Intel is as far behind as it is, they'd rather overall investment in hardware go down. The second Intel comes out with a leading chip, you'll suddenly see them produce a study with the opposite result.
Why is it that we all woke up one morning and every corporation is suddenly saying this same thing?
Essentially, FLOSS tools must be learned over time, typically at school; then you can profit for life. Commercial software is an endless shallow-learning process, full of frustration at getting anything done. No LLM can solve that; it's a design chosen for business purposes.
This is exactly what I have observed too. Notice how all commercial software comes polished and modern, with a very superficial UI. And despite most software products being hundreds of times more complex than simple electronic gadgets, such gadgets often come with more complex manuals.
It's obvious that the true end-game of AI is not to make end users more productive, but rather de-skill end-users and make them entirely beholden to third-party software. Most companies would love the tradeoff of making their employees more like cogs at the cost of lower productivity.
Alas, I too am held hostage by Emacs as most software doesn't even come close to the flexibility it provides.
In the end it's the same plot every time: in 1894 Guido Baccelli, Italian Education Minister SEVEN times, stated "we need to teach just reading and writing; be careful teaching history; we MUST set aside anti-dogmatism and critical thinking; the people MUST NOT THINK or we will be in trouble".
We see the same trend everywhere, in every profession, even in education: https://www.theatlantic.com/ideas/archive/2020/08/i-was-usef... The issue is that such people are unable to innovate, and smart, educated people are always scarce, so if you do not cultivate knowledge you simply lose it. Knowledge is the sole "natural resource" that grows with use instead of the contrary.
Idiot discovers that more "tools" make users less productive.