WOW! Glasses, AirPods, and your phone as the processing brain connecting the two devices would be AMAZING! I have never thought about that. For sure seems like an Apple idea.
I think it does matter why the stock performed so well on Tuesday and Wednesday. If the move was driven by a large volume of buybacks, then it shouldn't be messaged as enthusiasm for the new tech. But of course, that's exactly how most will message it.
Usually as investors we view increases in revenue as the key, and it is important. But at the same time Apple has 3 huge advantages on the cost side of AI.
1. Apple's on-device model is a 3-billion-parameter SLM, but there are indications it performs like a larger 8-billion-parameter model.
2. All companies in the industry have to invest in very expensive Nvidia servers, while the overwhelming majority of Apple's AI requests will be handled by the processor the customer has already purchased.
3. In Apple's own private cloud it can run larger language models with 2 big cost savings. First, those data centers run on renewable energy rather than paying electric utilities that will be gouging the tech industry with high prices. Second, the cloud runs on Apple's own silicon, which costs Apple far less than Nvidia servers.
If we are going to have our own personal LLMs the way Apple is showcasing, then we will need all that data synced to iCloud. Everyone will want to upgrade their iCloud accounts.
How does Musk deserve a pay package equivalent to Tesla's total profits over the last 3 years when the stock has done nothing during that time?