Apple debuts new catchall AI branding, generative features during WWDC 2024 keynote.
> I can't wait until AI goes the way of blockchain.

Are you talking about Apple Intelligence?
> It feels Apple is approaching AI the same way Google does: stuffing AI gizmos everywhere. Let's see if there is some level of coherence in all that.

Actually, I thought that Apple was very targeted in how they are using AI - with the catch-all option of explicitly using ChatGPT as an escape hatch if you want to do more…
> I wish I knew which Arsian said it first, but Apple has been using AI for a long, long time. Computational photography is AI. Siri is (bad) AI. OSX has had an AI text summarizer for a long time. Apple has just resisted making AI a prominent buzzword longer than everyone else.

Right, that's why they kept saying "Machine Learning" during the first half; that's what they've always called it. They had to pivot to saying "AI" to make investors happy, because those investors just want to hear buzzwords they think will earn more money.
> I do not want a single person to send me an AI Generated Image. I will accept GenMojis.

LOL. I feel ya.
> I can't wait until AI goes the way of blockchain.

Personally I think what Apple is doing here is a bit more useful than, say, what Microsoft is doing in the space. Like the keynote showed, they are focusing on making it more useful to people as a product. Genmojis are not my thing, but I can see kids loving that. Some of the Siri features shown were actually useful, imo.
> This year, Apple figured out a new way to largely avoid the abbreviation "AI" by coining "Apple Intelligence," a catchall branding term that refers to a broad group of machine learning, LLM, and image generation technologies. By our count, the term "AI" only appeared once in the keynote: Near the end of the presentation, Apple executive Craig Federighi said, "It's AI for the rest of us."

I love a spot of good trolling and goddamn this is some good trolling. Also, shout-out to the Arsian who anticipated that Apple would start using AI for "Apple Intelligence"... I'm a little surprised they did it, so good on you.
> I wish I knew which Arsian said it first, but Apple has been using AI for a long, long time. Computational photography is AI. Siri is (bad) AI. OSX has had an AI text summarizer for a long time. Apple has just resisted making AI a prominent buzzword longer than everyone else.

I remember a tool called simply “101” at one point that was a killer text summarizer at Apple. I think it was based on some of the indexing stuff that later went into the OS, but was originally destined for Copland.
> iPhone 15 Pro (Max) only (and M1 and above on the iPad and Mac). Ouch. I may start to feel like my 12 is a bit long in the tooth, maybe, depending on how well this works in practice. The contextual surfacing and transcribing of information does seem useful. I’m less keen on the generative aspects.

Oof... so it'll work on my new iPad but not my 13PM. At least it'll work on my M1 MBA. I guess once I get my iPad paid off I might have to start thinking about a 17PM or 17 "slim" or whatever.
> I can't wait until AI goes the way of blockchain.

AI - if we so call LLMs and machine learning - won't go anywhere. Blockchains are still in use too. But it'll cease to be an overhyped stock pump and it'll start to find its natural uses in products and workflows for which it is actually a value added. Again, like blockchain and its predecessors in the great circle of tech hype.
> I wish I knew which Arsian said it first, but Apple has been using AI for a long, long time. Computational photography is AI. Siri is (bad) AI. OSX has had an AI text summarizer for a long time. Apple has just resisted making AI a prominent buzzword longer than everyone else.

Indeed. "AI" and machine learning have been around forever, but it looks like the MBAs have decided for us that "AI" now exclusively means "post-Transformer AI".
> AI - if we so call LLMs and machine learning - won't go anywhere. Blockchains are still in use too. But it'll cease to be an overhyped stock pump and it'll start to find its natural uses in products and workflows for which it is actually a value added. Again, like blockchain and its predecessors in the great circle of tech hype.

Are blockchains still in use? Honestly? I sort of mean that seriously. I have never come across a use case for blockchains. I’ve heard of theoretical use cases that solve problems that people don’t face - the trust issue, for instance.
And then they doubled down on the AI buzzword. They should have gotten really scared about their ML miscalculation.
> I can't wait until AI goes the way of blockchain.

I think it is going to outlast you by many orders of magnitude. In terms of capabilities and usefulness, generative AI is just barely hitting 1910 airplane levels of usefulness. The scope, capability, reliability, and accuracy are all going to rise qualitatively in each of the next 5-year periods.
> By our count, the term "AI" only appeared once in the keynote: Near the end of the presentation, Apple executive Craig Federighi said, "It's AI for the rest of us."
"You should not have to hand over all the details of your life to be warehoused and analyzed in someone's AI cloud."
"This sets a brand-new standard for privacy in AI, and unlocks intelligence you can trust."
"And we're starting out with the best of these, the pioneer and market leader ChatGPT from OpenAI, powered by GPT-4o."
"We also intend to add support for other AI models in the future."
"This is AI for the rest of us, personal intelligence you can rely on at work, home, and everywhere in between."
> Are Blockchains still in use? Honestly? I sort of mean that seriously. I have never come across a use case for blockchains. I’ve heard of theoretical use cases that solve problems that people don’t face - the trust issue for instance.

Crypto is still a thing. Every time you buy or sell or send crypto, that gets added to the ledger. Now, that technology is (and should have been) obviously more limited in its usefulness and scope than an LLM, but wildly overhyping something to pump a stock and hopefully leave someone else holding the bag is a tech industry tradition.
Whereas there are clear use cases today for LLMs - and when the gimmick wears off and we come down to solid use cases, I don’t expect to feel that same incredulity when someone tells me LLMs are still in use.
> Until this point I have not seen a Generative AI application that made me think the technology was necessarily worth it. Chatbots did nothing for me. Seeing a tightly integrated semantic search for photos, seeing summarisation of messages, seeing all that packaged up and available via the UI I already use. That might have sold me a bit.

I'd be happy if Siri on a HomePod could use a playlist from the iPhone in my pocket instead of whining about Apple Music.
I wonder if in this new Siri I can ask it to play my playlist on Spotify. Or do we have Apple Intelligence meeting its natural enemy, Monopolistic Practices?
> I love a spot of good trolling and goddamn this is some good trolling. Also, shout-out to the Arsian who anticipated that Apple would start using AI for "Apple Intelligence"...I'm a little surprised they did it, so good on you.

I think this is because Apple has generally earned trust in user-facing features, where Microsoft’s user-facing features tend to feel like catch-up or used solely for marketing. I’m not making a judgement on whether Windows or macOS is better, but new Windows features tend to be meh.
It definitely is a bit of cognitive dissonance that I firmly feel that Alphabet AI, Meta AI and Microsoft AI are all probably irredeemably awful, and yet, when Apple talks about AI, I think, "Well, let's see what they manage to do with it." Then again, it is, I think, fair to extend a certain amount of credit to companies who have a habit of doing more than just giving lip service to their individual users' privacy. Anyway, all that is to say that I'm curious to see where Apple takes all of this.