If you were watching the State of the Union address, and you’re an iPhone user, then toward the end of the speech, during President Trump’s recounting of the story of Chief Warrant Officer Eric Slover, you might have had Siri triggered—assuming you have voice activation turned on.
Gizmodo’s own Matt Novak brought it to my attention:
At least one other Bluesky user confirmed that she experienced the same thing. A user on X said the erroneous Siri trigger word was “serious” not “searing,” but the timing of the post suggests it was the same moment.
Another Bluesky user (whose posts are visible only to those logged into Bluesky) posted the Google results page Siri pulled up following the Siri-triggering line, featuring a bunch of results about bullets going through legs.
Trump’s recounting of Slover’s harrowing story very much did include such gory details:
While preparing to land, enemy machine guns fired from every angle and Eric was hit very badly in the leg and hip, one bullet after another. He absorbed four agonizing shots, shredding his leg into numerous pieces. And yet, despite the fact that the use of his legs was vital to a successful helicopter flight — legs are the most important part of flying a helicopter — to deliver the many commandos who would capture and detain Maduro was the only thing Eric was thinking about.
Exactly which word or words woke up Siri—“searing” or “serious” or perhaps some part of “…was hit very…”—is not yet totally clear, but it clearly happened around this moment.
But if your Siri was triggered, I hope it spiced up an otherwise dire night of politics. That speech was rough, folks!
Apple Music has a new Playlist Playground feature that lets users create a playlist with a text-based prompt
Apple is preparing to release iOS 26.4, and while the early beta stages suggest it will be a feature-packed update for the iPhone, one major piece of the puzzle remains conspicuously absent.
Despite rumours that this version would finally see the debut of the long-awaited, Gemini-powered “Siri 2.0,” the latest build seems to focus more on refining media, security and messaging.
Here are the five most significant upgrades coming to your iPhone with iOS 26.4.
1. Apple Music: “Playlist Playground” AI
The standout feature for many will be “Playlist Playground,” an AI-driven tool for Apple Music. This allows you to generate a 25-song playlist simply by typing a text prompt – for example, “chill acoustic tracks for a rainy afternoon” or “90s grunge hits for the gym.” What’s more, the Music app is also getting a visual refresh with full-page artwork for albums and a new “Concerts Near You” section to help you find live tour dates for your favourite artists.
2. Enhanced video capabilities for podcasts
Apple is significantly upgrading its Podcasts app to bridge the gap between audio and video content. Using new HTTP Live Streaming (HLS) technology, the app will now allow you to switch seamlessly between watching a video podcast and listening to the audio version without losing your place. The quality will also adjust automatically based on your network conditions, ensuring a smoother experience for those on the move.
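Apple hasn’t detailed its implementation, but the seamless video-to-audio handoff described here maps naturally onto standard HLS capabilities: a multivariant playlist can list several video tiers alongside an audio-only rendition, all sharing one timeline, so a player can move between them (or step down in quality) without losing its place. A rough sketch of such a playlist, with hypothetical filenames and bitrates:

```
#EXTM3U
# Audio-only rendition the player can switch to on demand
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="audio",NAME="Audio only",DEFAULT=YES,URI="audio/stream.m3u8"

# Video tiers at increasing bitrates, all referencing the same audio group
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360,AUDIO="audio"
video/360p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720,AUDIO="audio"
video/720p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=5000000,RESOLUTION=1920x1080,AUDIO="audio"
video/1080p.m3u8
```

Because every variant is segmented against the same timeline, switching from the 1080p tier to the audio-only rendition is just a matter of the player fetching different segments from the same position.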
3. End-to-End RCS encryption
Following the adoption of RCS (Rich Communication Services) in previous updates, Apple is finally testing end-to-end encryption for these messages. While currently limited to testing between Apple devices in the beta, this is a critical security step. It aims to make messaging with Android users as secure as iMessage by ensuring that texts, photos, and videos cannot be intercepted or read by third parties while in transit.
4. Stolen device protection by default
In a move to bolster user security, iOS 26.4 is expected to enable Stolen Device Protection by default for all users. Previously an opt-in setting, this feature requires biometric authentication (Face ID or Touch ID) for sensitive actions, such as changing your Apple ID password, when you are away from familiar locations like your home or office. By making this a standard setting, Apple is making it significantly harder for thieves to lock owners out of their own accounts.
5. Smart “Urgent” reminders
Building on the task prioritization introduced in iOS 26.2, the Reminders app is gaining a dedicated “Urgent” smart list. When you tag a task as urgent, it will automatically populate this list and trigger persistent alarms to ensure high-priority items aren’t buried under your daily chores. This update also adds a new “Set Battery Charge Limit” action to the Shortcuts app, allowing you to manage your device’s long-term battery health more easily.
Conclusion: The Siri-shaped hole
While iOS 26.4 brings a wealth of improvements and security patches, the absence of the smarter, more personal Siri is hard to ignore. Initial reports suggested that the AI assistant – capable of understanding on-screen context and cross-app interactions – would arrive in this cycle. However, current leaks now suggest that Apple may hold back the full Siri overhaul until iOS 27 in late 2026.
Mark Gurman, Bloomberg’s Apple scoops guy, says the development of the latest version of Siri is not looking good in tests. It’s apparently going badly enough that Apple will release only a partial version when the updated voice assistant debuts in the next version of iOS. To be clear, the iOS 26.4 update is still expected to arrive next month, and it’s still expected to have a new version of Siri, but it may be a bit of a letdown.
That’s not good for Apple. Perhaps you’ll recall that Apple has been advertising a version of Siri that works as a smart, seamless, automated personal assistant in your pocket for a long time. Apple even made a commercial about this with Bella Ramsey released in fall of 2024:
But that ad had to be pulled because Apple couldn’t ship a real-life version of what it depicted. Asking Siri questions as if it’s a chatbot and then getting good answers drawn from your information across multiple apps is a function that certainly feels possible based on existing technology. But it’s now 2026 and Apple still hasn’t released that version of Siri.
And as I wrote late last month, Apple is perceived as needing to notch a win in the AI area after falling way behind Google in AI authority. The AI model driving the new, still unreleased, Siri is essentially rented from Google for $1 billion per year. And who knows, perhaps Google’s model is the culprit behind the latest problems with Siri, but it’s hard to picture consumers blaming Google if Apple can’t execute a solid new Siri product.
Gurman’s sources tell him tests of the new Siri found that it processes queries incorrectly, and that it sometimes takes “too long”—too long for what? We don’t get to know, but it’s clearly slow. Gurman points to the feature from the Bella Ramsey ad, in which the AI mines your personal data to answer questions like “What was that Greek restaurant Larry told me to try?”, as one likely to be delayed past iOS 26.4.
If it’s iOS 26.5 that eventually gets the Bella Ramsey version of Siri, and the user interface ends up matching the internal build that Apple employees are currently using for testing, Gurman says there may be an optional toggle allowing the user to “preview” the new Siri. In other words, it’ll be framed as something the user can try at their own peril.
So ostensibly, these Siri features aren’t being cancelled or eliminated, but delayed. Apple will, Gurman says, release some sort of partial Siri update in March with iOS 26.4, and then the rest of the new Siri features will be sprinkled into the 26.5 update in May, and the larger update to iOS 27 in September, when the iPhone 18 line is scheduled to roll out. Though this “remains a fluid situation, and Apple’s plans may change further,” Gurman writes.
Apparently, according to Gurman, another delayed feature will be Siri-based voice controls for “App Intents,” a new framework for controlling apps that Apple says will perform an “increasingly critical role within Apple’s developer platforms.” This delay may not be mourned by developers, who, judging from X posts, don’t seem super eager to figure out how to use it.
Ever felt like your phone is eavesdropping on your conversations? You mention a new pair of hiking boots to a friend, and miraculously, your Instagram feed is awash with walking clobber.
While tech companies often claim they don’t “listen” in the traditional sense, they do use “passive listening” for wake words as well as massive amounts of behavioural data to predict your interests.
If you want to reclaim your privacy, here is a short guide on how to shut down the digital ears of your smartphone right now.
1. Disable Your “Virtual Assistant”
The primary way your microphone stays “active” is by listening for wake words like “Hey Siri” or “Hey Google.” While these features are meant to be helpful, they mean your microphone is technically always on.
For iPhone: Go to Settings > Siri & Search. Toggle off “Listen for ‘Hey Siri’” and “Press Side Button for Siri.”
For Android: Open the Google App, tap your profile icon, and go to Settings > Google Assistant > Hey Google & Voice Match. Toggle it off.
2. Audit Your App Permissions
Many apps request microphone access during installation for no logical reason. Why does a calculator or a photo editor need to hear you?
For iPhone: Go to Settings > Privacy & Security > Microphone. You will see a list of every app with mic access. Toggle off anything that doesn’t strictly need it (including social media apps if you don’t record stories).
For Android: Go to Settings > Apps > See all apps. Select an app, tap Permissions, then Microphone, and select “Don’t allow.”
3. Kill “Personalized Advertising”
Even if the mic is off, apps track your “cross-contextual” behaviour – in other words, they follow you from one app to another to build a profile of your life.
For iPhone: Go to Settings > Privacy & Security > Tracking and turn off “Allow Apps to Request to Track.” Then go to Apple Advertising at the bottom of the Privacy menu and turn off “Personalized Ads.”
For Android: Go to Settings > Google > Ads and tap “Delete Advertising ID.” This resets the unique string of numbers used by marketers to identify you.
The orange dot on an iPhone screen means the mic is in use
4. Watch for the “Warning Lights”
Modern smartphones have built-in physical indicators to tell you when a hardware component is active.
iPhone users: Look for a small orange dot in the top right corner of your screen. If you see it and you aren’t on a call or recording a memo, an app is actively using your microphone.
Android users: On newer versions (Android 12+), a green microphone icon or dot appears in the status bar when the mic is being accessed.
5. Clear Your Voice History
Big Tech keeps a “memory bank” of your previous voice requests to “improve their service.” You should purge this regularly.
Google: Visit myactivity.google.com, click on “Web & App Activity,” and find the section for “Voice & Audio Activity” to delete your recordings.
Apple: Go to Settings > Siri & Search > Siri & Dictation History and tap “Delete Siri & Dictation History.”
6. The “Hardware” Approach
If you want to go full paranoid mode, consider physical barriers. Some privacy-conscious users use “microphone blockers” – small plugs that go into the headphone jack or charging port to trick the phone into thinking an external mic is plugged in.
Alternatively, keep your phone in another room or inside a “Faraday bag” during sensitive private conversations.
By following these steps, you move from being a passive data point to an empowered consumer, ensuring that your private conversations stay exactly that.
Sirius XM (NASDAQ:SIRI – Free Report) had its target price raised by Rosenblatt Securities from $23.00 to $24.00 in a report released on Friday morning, Benzinga reports. They currently have a neutral rating on the stock.
SIRI has been the subject of several other research reports. Moffett Nathanson assumed coverage on shares of Sirius XM in a research note on Tuesday, January 27th. They issued a “neutral” rating and a $21.00 target price for the company. Weiss Ratings restated a “hold (c-)” rating on shares of Sirius XM in a report on Monday, December 29th. Benchmark reiterated a “buy” rating and issued a $30.00 price objective (up previously from $28.00) on shares of Sirius XM in a research note on Friday, October 31st. JPMorgan Chase & Co. lifted their target price on shares of Sirius XM from $19.00 to $20.00 and gave the stock an “underweight” rating in a research report on Friday, October 31st. Finally, Barrington Research restated an “outperform” rating and issued a $28.00 target price on shares of Sirius XM in a research note on Thursday. Three investment analysts have rated the stock with a Buy rating, three have assigned a Hold rating and four have assigned a Sell rating to the company. According to MarketBeat.com, the stock currently has a consensus rating of “Reduce” and a consensus price target of $24.00.
Shares of NASDAQ:SIRI opened at $21.68 on Friday. The business’s 50 day moving average is $20.95 and its 200 day moving average is $21.83. Sirius XM has a 12-month low of $18.69 and a 12-month high of $27.41. The stock has a market cap of $7.30 billion, a price-to-earnings ratio of 9.68, a PEG ratio of 0.29 and a beta of 0.93. The company has a quick ratio of 0.30, a current ratio of 0.30 and a debt-to-equity ratio of 0.75.
Sirius XM Announces Dividend
The company also recently declared a quarterly dividend, which will be paid on Friday, February 27th. Stockholders of record on Wednesday, February 11th will be given a dividend of $0.27 per share. This represents a $1.08 dividend on an annualized basis and a dividend yield of 5.0%. The ex-dividend date of this dividend is Wednesday, February 11th. Sirius XM’s dividend payout ratio is currently 48.21%.
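The dividend figures are internally consistent with the share price and P/E ratio quoted elsewhere in the report: annualizing the quarterly payout gives the stated $1.08, dividing that by the $21.68 share price gives roughly the 5.0% yield, and dividing it by the EPS implied by the 9.68 P/E ratio reproduces the ~48% payout ratio. A quick sanity check in Python:

```python
# Verifying the SIRI dividend figures using numbers cited in the article
quarterly_dividend = 0.27   # $ per share, per quarter
share_price = 21.68         # Friday's opening price cited in the report
pe_ratio = 9.68             # trailing P/E cited in the report

annual_dividend = quarterly_dividend * 4          # $1.08 annualized
dividend_yield = annual_dividend / share_price    # ~5.0%
implied_eps = share_price / pe_ratio              # trailing EPS implied by the P/E
payout_ratio = annual_dividend / implied_eps      # ~48.2%

print(f"Annualized dividend: ${annual_dividend:.2f}")
print(f"Dividend yield: {dividend_yield:.1%}")
print(f"Payout ratio: {payout_ratio:.1%}")
```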
Institutional Investors Weigh In On Sirius XM
A number of hedge funds and other institutional investors have recently added to or reduced their stakes in the company. Brighton Jones LLC bought a new position in Sirius XM in the 4th quarter worth about $622,000. Hsbc Holdings PLC lifted its holdings in shares of Sirius XM by 779.4% in the second quarter. Hsbc Holdings PLC now owns 115,237 shares of the company’s stock worth $2,639,000 after buying an additional 102,133 shares in the last quarter. SG Americas Securities LLC grew its stake in Sirius XM by 292.5% during the third quarter. SG Americas Securities LLC now owns 113,728 shares of the company’s stock valued at $2,647,000 after acquiring an additional 84,751 shares in the last quarter. Tweedy Browne Co LLC bought a new stake in Sirius XM during the 2nd quarter worth approximately $289,000. Finally, CWM LLC increased its holdings in shares of Sirius XM by 126.3% in the second quarter. CWM LLC now owns 120,436 shares of the company’s stock worth $2,766,000 after purchasing an additional 67,222 shares during the period. 10.69% of the stock is currently owned by institutional investors.
Trending Headlines about Sirius XM
Here are the key news stories impacting Sirius XM this week:
Positive Sentiment: Management reiterated strong free-cash-flow targets and outlined cost savings (>$250M realized, $100M more targeted) that support 2026–2027 FCF growth; analysts and investors are highlighting a resilient FCF profile as a reason to own the stock. Sirius XM: Cash Flow Resilience Is Underappreciated
Positive Sentiment: Barrington Research reaffirmed an “outperform” rating and $28 price target, signaling upside potential versus current levels and supporting buy-side interest. Barrington Research coverage
Neutral Sentiment: Q4 revenue roughly matched/beat estimates while reported GAAP EPS figures were reported differently across outlets (some beats cited, some misses listed); overall results were viewed as mixed and generated volatility around the print. SiriusXM Reports Fourth Quarter and Full-Year 2025 Results
Neutral Sentiment: Unusual options activity: heavy call buying today (≈56,883 calls, ~264% above normal), which could reflect speculative bets on a rebound or hedged institutional positioning; this increases intraday volume/volatility but is not a fundamentals change.
Neutral Sentiment: Rosenblatt bumped its price target to $24 but kept a “neutral” rating — a modest positive to sentiment but not a strong endorsement. Rosenblatt update
Negative Sentiment: Company reported a full-year 2025 loss of ~301k self-pay subscribers, a clear headwind to top-line momentum and a key reason investors are trimming positions. Sirius XM Stock Is Sliding Friday: What’s Going On?
Negative Sentiment: FY2026 revenue guidance was shown as roughly in line to slightly below consensus, and EPS guidance was unclear, leaving some investors concerned about near-term growth visibility. MarketWatch: Sirius XM guidance
Negative Sentiment: Sirius XM agreed to a ~$28M settlement related to alleged telemarketing practices — a headline hit and a small one-time cash outflow that adds to near-term noise. SiriusXM agrees to $28M settlement
Sirius XM Holdings Inc is a leading audio entertainment company specializing in subscription-based satellite and streaming radio services. Formed in 2008 through the merger of Sirius Satellite Radio and XM Satellite Radio, the company delivers a broad range of programming across music, sports, news, talk and comedy channels. Sirius XM’s offerings include exclusive live sports play-by-play, artist-curated music channels, news coverage from major networks and original talk and entertainment series.
Headquartered in New York City, Sirius XM serves listeners throughout the United States and Canada, reaching tens of millions of subscribers.
Further Reading
Receive News & Ratings for Sirius XM Daily – Enter your email address below to receive a concise daily summary of the latest news and analysts’ ratings for Sirius XM and related companies with MarketBeat.com’s FREE daily email newsletter.
The UK is losing more jobs than it is creating because of artificial intelligence and is being hit harder than rival large economies, new research suggests. British companies reported that AI had resulted in net job losses over the past 12 months, down 8% – the worst rate among the leading economies surveyed, including the US, Japan, Germany and Australia, according to a study by the investment bank Morgan Stanley. The research, which was shared with Bloomberg, surveyed companies using AI for at least a year across five industries: consumer staples and retail, real estate, transport, healthcare equipment and cars. Guardian
Apple is planning to unveil its newly revamped Siri assistant at an event next month, according to a report. The latest version of Apple’s digital assistant will be powered by Google’s market-leading Gemini AI model following a recently announced partnership between the two US tech giants. The long-overdue upgrade to Siri, which launched as Apple’s proprietary voice assistant on the iPhone in 2011, will arrive with iOS 26.4, according to Bloomberg. Beta testing is expected to begin in the second half of February before a public rollout in March or April. Independent
One of them is an “idiot”. The other is running a “cesspit”. Even for connoisseurs of corporate spats, the war of words that broke out this week between the world’s richest man Elon Musk and Ryanair’s Michael O’Leary has turned into a classic of the genre. The two men have been tearing lumps out of each other for the last few days, and the argument could even turn into a full-scale takeover of the airline. And yet, one point is surely clear. Sure, Musk has plenty to boast about. But so far he is no match for the pugnacious O’Leary – and right now he just looks envious of his wittier rival. Telegraph
You may well have noticed issues with the automatic filters and spam scanning in your Gmail inbox over the weekend: these are issues that Google has officially acknowledged, and a fix should now be making its way out to users. As per the Google Workspace Status Dashboard (via Engadget), numerous issues affected users of Google’s email app across the course of Saturday. These issues included “misclassification of emails” via Gmail’s built-in automatic filtering. Tech Radar
In certain corners of the internet, on niche news feeds and algorithms, an AI-generated British schoolgirl has emerged as something of a phenomenon. Her name is Amelia, a purple-haired “goth girl” who proudly carries a mini union flag and appears to have a penchant for racism. If you are unfamiliar with Amelia, the chances are you will soon encounter one viral meme or another inspired by her on Facebook or X, where her reputation is growing. Guardian
Ofcom is formally investigating whether Meta complied with legally binding information requests regarding WhatsApp’s role in the UK business messaging ecosystem. The case, published on Ofcom’s own enforcement register on Friday, centers on two statutory “section 135” notices issued to Meta on July 31, 2024, and June 19, 2025, under the Communications Act 2003. Those notices required Meta to hand over data on how WhatsApp Business competes in the application-to-person messaging market – the unglamorous stuff companies use to ping customers about parcels, appointments, and login codes. The Register
Apple has officially joined forces with Google to use its Gemini AI models as the foundation for a massive Siri overhaul – a move that confirms the iPhone maker is looking externally to accelerate its lagging artificial intelligence strategy.
The multi-year collaboration, which has just been announced, will see Google’s Gemini 3 technology integrated into future Apple Foundation Models.
This partnership marks a significant and pragmatic shift for Apple, which has historically prided itself on developing every layer of its technology in-house. Reports suggest the deal is worth approximately $1 billion annually, positioning Google as the primary engine behind the “more personalized” Siri expected to debut later this year.
The primary reason for this alliance is Apple’s need to catch up. Despite marketing “Apple Intelligence” heavily over the last two years, the company has faced significant development delays, pushing the full Siri revamp into 2026. Internal performance testing reportedly determined that Google’s Gemini offered a more capable and scalable foundation than Apple’s own early models.
By leveraging Google’s infrastructure, Apple can quickly introduce features that its rivals, such as Samsung and Google’s own Pixel line, already offer. This includes Siri’s ability to understand on-screen content, manage complex multi-step tasks across different apps and utilize personal context from emails and messages to provide more relevant assistance.
Performance and privacy gains
The advantages for Apple users are expected to be substantial. The next generation of Siri will transition from a basic command-response assistant to a proactive agent capable of natural dialogue. Because the 1.2-trillion-parameter Gemini model is far larger than anything Apple currently runs, Siri should become significantly more accurate and versatile.
Apple has also taken steps to mitigate its biggest brand risk: privacy. To maintain its strict privacy standards, the companies confirmed that these AI features will run on Apple’s own devices and its “Private Cloud Compute” system.
This means that while Google provides the “brains” or the underlying logic, the actual processing of sensitive user data remains within Apple-controlled environments, theoretically preventing Google from accessing personal user information.
Market risks and regulatory hurdles
However, the partnership carries considerable strategic and legal risks. By outsourcing the foundational layer of its AI, Apple risks becoming dependent on a direct competitor. Analysts warn that this could lead to “brand dilution,” where the iPhone’s unique edge is eroded because its core intelligence is identical to that of Android devices.
The deal has also immediately caught the attention of global regulators. Coming on the heels of major antitrust rulings against Google’s search monopoly, this new alliance, which creates an “AI duopoly”, is being closely monitored by the UK’s Competition and Markets Authority and EU policymakers.
Critics, including Elon Musk, have already slammed the move as an “unreasonable concentration of power” that could further stifle competition in the rapidly evolving AI landscape.
It’s been a bad month for AI gadgets, and that’s saying a lot. As if public opinion wasn’t already in the gutter after the collective letdown of Humane and its now-defunct Ai Pin, as well as its less expensive counterpart from Rabbit, the R1, reports have begun circulating that Sam Altman and Jony Ive, via their joint venture, IO, are also struggling to devise AI hardware that, ya know… works. But the fact that things are bad doesn’t mean they can’t get worse, and if new reports about Apple’s AI efforts are any indication, they just might.
According to a report from Bloomberg’s Mark Gurman, Apple is running into a few hiccups with its rumored robotic smart home hub, and I’ll give you one guess what one of those issues is. Per Bloomberg:
“The motor system has had engineering challenges, and the company has sought to find compelling AI uses for the device. That’s pushed out the current timeline to roughly two years from now.”
Mechanical challenges are one thing (that feels like a technical hurdle that Apple can throw money at and eventually resolve), but the AI part isn’t so simple. Devising useful AI features isn’t just a money or engineering problem; it’s a philosophical one. Before Apple figures out where to put its resources, it has to have a reason to devote those resources. And if the broader field of AI gadgets is any indication, those reasons haven’t been exactly forthcoming.
While there are plenty of AI features in existence right now (e.g., everything new and Gemini-related on Google’s Pixel devices), not many of them have resonated with consumers who either don’t know that they exist or haven’t been given a compelling reason to use them. If you can’t convince people to use AI features on a device that they have in their hands almost every second of every day, it’ll be even harder to convince them to use AI features on a device they have almost no frame of reference for, such as, I don’t know, a robotic smart home display.
Not to mention, everything is compounded by the next-gen Siri of it all. If there’s one place you want a next-gen voice assistant, it’s a smart home hub, but Apple has (somewhat infamously) struggled to deliver its promised chatbot-infused Siri, or at least one that works the way it should. Those struggles, by the way, are still firmly ongoing. And the rest of its Apple Intelligence features, while not quite as damaging to public perception, haven’t exactly blossomed yet, especially notification summaries, which Apple had to put on pause briefly, given the fact they, uh… were a bit of a disaster.
Maybe Apple will figure things out. The company doesn’t plan to release its arm-having robot for another two years, according to Bloomberg, and a lot could happen between now and then, but it’s hard to be optimistic with the way things are going. Per the Financial Times, Jony Ive and Sam Altman have struggled with just about every piece of their AI gadget (a palm-sized device that you can bring on the go), including how its voice assistant works and even how to get enough compute to power it via the cloud. Woof.
Clearly, there’s still a lot of work to be done before AI gadgets can be useful in the way that even the biggest tech companies are still trying to get them to be, but if there’s one company that could figure it out, Apple would be it. And if Apple can’t get the job done? Well, I’ve got bad news for Altman, Ive, and company.
This week, OpenAI announced that apps can now run directly inside ChatGPT, letting users book travel, create playlists, and edit designs without switching between different apps. Some immediately declared it the app platform of the future — predicting a ChatGPT-powered world where Apple’s App Store becomes obsolete.
An open question was answered today – “what will the AI-native distribution channel be?”
It looks like ChatGPT will be that channel with 800M active users + the Apps SDK.
This is likely as important as Steve Jobs announcing the app store in March of 2008 …
But while OpenAI’s app platform presents an emerging threat, Apple’s vision for an improved Siri — though still seriously delayed — could still play out in its favor.
After all, Apple already controls the hardware, the operating system, and has roughly 1.5 billion iPhone users globally, compared to ChatGPT’s 800 million weekly active users. If Apple’s bet pays off, it could position the iPhone maker in a way that would not only maintain its app industry dominance but also modernize how we use apps in the AI era.
Apple’s plan is to kill the app icon without killing the app itself. Its vision for AI-powered computing — introduced at its developer conference last year — would see iPhone users interact with an overhauled version of Siri and a revamped system that changes the way you use apps on your phone. (Imagine less tapping and more talking.)
Apps are passé, long live apps?
It’s an idea whose time has come.
Organizing little tappable icons on your iPhone’s Home Screen to make online information more accessible is a dated metaphor for computing. Meant to resemble a scaled-down version of a computer’s desktop, apps are becoming a less common way for users to interact with many of their preferred online services.
These days, consumers are just as likely to ask an AI assistant for a recommendation or insight as they are to do a Google search or launch a dedicated, single-purpose app, like Yelp. They’ll talk out loud to their smart speakers or Bluetooth-connected AirPods to play their favorite tunes; they’ll ask a chatbot for business information or a summary of reviews for a new movie or show.
The AI, a large language model trained on web-scraped data and more, determines what the user wants to know and spits out a response.
This is arguably easier than scouring through Google’s search results for the right link with the answer. (That’s something Google itself realized over a decade ago, when it started putting answers to user queries right on the search results page.)
AI is also often easier than finding the right app on your now overcrowded iPhone, launching it, and then interacting with its user interface — which varies from app to app — to perform your task or get an answer to your question.
However, ChatGPT’s app system, while seemingly improving on this model, remains locked inside the ChatGPT user experience. It requires consumers to engage in a chatbot-style interface to use their apps, which could require user education. To call up an app, you have to name it as the first word of your prompt or otherwise mention the app by name to get a button that prompts you to “use the app for the answer.” Then, you have to type in an accurate query. (If you mess this up, early tests by Bloomberg indicate you could get stuck on a loading screen with no results!)
We have to wonder: is this the future of apps, or just the future while there’s no other competition? When another solution becomes available — one that’s built into your iPhone, no less — will consumers keep using ChatGPT, or are they still willing to give Siri another try? We don’t know, but we wouldn’t count out Apple yet, even though Siri has quite a bad reputation to salvage at this point.
Siri may be an embarrassment as it stands today, but Apple’s overall ecosystem has advantages. For starters, consumers already have the apps they want to use on their phone or know how to find them on the App Store, if not. They’ve used many of these apps for years. Muscle memory goes a long way!
Meanwhile, there are a few roadblocks to getting started with ChatGPT’s app platform.
You have to install the app in question, of course; then you have to connect the app to ChatGPT by jumping through a warning-filled permission screen. This process requires you to authenticate with the app using your existing username and password, and to enter the two-factor authentication code, if applicable.
After this one-time setup, things should be easier. For instance, after you generate a Spotify playlist with AI, it can be launched in the Spotify app with a tap.
However, this experience won’t differ much from Apple’s plans if Apple is able to make things work as promised. Apple says you’ll be able to talk or text Siri to control your apps.
There are other disadvantages to the OpenAI app model. You can only interact with one app at a time, instead of being able to switch back and forth between apps — something that could be useful when comparing prices or trying to decide between a hotel room and an Airbnb.
Using apps within ChatGPT also strips away the branding, design, and identity that consumers associate with their favorite apps. (For those who hate how cluttered Spotify’s app has become, perhaps that’s a good thing. Others, however, will disagree.) And, in some cases, using the mobile app version to accomplish your goals may still be easier than using the ChatGPT app version because of the flexibility the former offers.
Finally, compelling users to switch app platforms could be difficult when there isn’t an obvious advantage to using apps within ChatGPT — except for the fact that it’s neat that you can.
Can Apple save Siri’s reputation with AI features?
In its WWDC 2024 demonstration — which Apple swears was not “demoware” — the company showed how the apps would function under this new system and how they could use other AI features like proofreading.
Most importantly, Apple told developers that they’ll be able to take advantage of some of its AI capabilities without having to do additional work — like a note-taking app using proofreading or rewriting tools. Plus, developers who have already integrated SiriKit into their apps will be able to do more in terms of having users take action in their apps. (SiriKit, a toolkit for making apps interoperable with Siri and Apple’s Shortcuts, is something developers have been using since iOS 10.)
These developers will see immediate enhancements when the new Siri rolls out.
Apple said it will focus on categories like Notes, Media, Messaging, Payments, Restaurant Reservations, VoIP Calling, and Workouts, to start.
Apps in these categories will be able to let their users take actions via Siri. In practice, that means Siri will be able to invoke any item from an app’s menus. For example, you could ask Siri to see your presenter notes in a slide deck, and your productivity app would respond accordingly.
The apps would also be able to access any text displayed on the page using Apple’s standard text systems. That could make the app interactions feel more natural, without the user having to give specifically worded prompts or commands. For instance, if you had a reminder to wish your grandpa a happy birthday, you could say “FaceTime him” to take that action.
Apple’s existing Intents framework is also being updated to gain access to Apple Intelligence, covering even more apps in categories like Books, Browsers, Cameras, Document Readers, File Management, Journals, Mail, Photos, Presentations, Spreadsheets, Whiteboards, and Word Processors. Here, Apple is creating new “Intents” that are pre-defined, trained, and tested, and making them available to developers.
That means you could tell the photo-editing app Darkroom to apply a cinematic filter to an image via Siri. Plus, Siri will be able to suggest an app’s actions, helping iPhone users discover what their apps can do and take those actions.
Developers have been adopting the App Intents framework, introduced in iOS 16, because it offers other functionality to integrate their app’s actions and content with other platform features, including Spotlight, Siri, the iPhone’s Action button, widgets, controls, and visual search features — not just Apple Intelligence.
Also, unlike ChatGPT, Apple runs its own operating system on its own hardware and offers the App Store as a discovery mechanism, the app infrastructure, and developer tools, APIs, and frameworks — not just the AI-powered interface that will help you use your apps.
Though Apple may have to borrow some AI tech from others to do that last bit, it has the data to personalize your app recommendations, and, for the privacy-minded, the controls that let you limit how much information apps themselves can collect. (Where’s the “Do Not Track” option for ChatGPT’s app system, we wonder?)
OpenAI’s system doesn’t work out of the box with all your apps at launch. It requires developer adoption and relies on the Model Context Protocol (MCP), a newer technology for connecting AI assistants to other systems. That’s why ChatGPT currently works with only a handful of apps, like Booking.com, Expedia, Spotify, Figma, Coursera, Zillow, and Canva. MCP adoption is growing, but the delay in its becoming broadly adopted could give Apple the extra time it needs to catch up.
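Under the hood, MCP is essentially JSON-RPC 2.0 over a transport: an assistant asks a server what tools it exposes, then invokes one by name. A minimal sketch of those message envelopes is below; the `tools/list` and `tools/call` method names come from the MCP spec, while the `create_playlist` tool and its arguments are hypothetical stand-ins for whatever an app might expose.

```python
import json

def mcp_request(req_id, method, params):
    """Build an MCP message, which follows the JSON-RPC 2.0 envelope."""
    return {"jsonrpc": "2.0", "id": req_id, "method": method, "params": params}

# Step 1: ask the server which tools it exposes.
list_tools = mcp_request(1, "tools/list", {})

# Step 2: invoke one of them -- here, a hypothetical playlist-building tool.
call_tool = mcp_request(2, "tools/call", {
    "name": "create_playlist",
    "arguments": {"mood": "focus", "length": 20},
})

print(json.dumps(call_tool, indent=2))
```

The point is that every app integration boils down to the same small request shape, which is why adoption depends on developers wiring their services up as MCP servers rather than on anything ChatGPT-specific.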
What’s more, word is that Apple’s AI system is nearly ready. The company is reportedly already testing it internally, allowing users to take actions in apps by using Siri voice commands. Bloomberg reported that this smarter version of Siri works out of the box with many apps, including those from major players like Uber, AllTrails, Threads, Temu, Amazon, YouTube, Facebook, and WhatsApp. And it’s still on track to ship next year, Apple confirmed to TechCrunch.
Apple has an iPhone, OpenAI has Jony Ive
The iPhone’s status as an app platform will also be difficult to disrupt, even from a company as large and powerful as OpenAI.
The ChatGPT maker understands this, too, which is why OpenAI is exploring its own device with Apple’s former head of design, Jony Ive. It wants its AI to become more of a part of consumers’ everyday lives and habits, which could require a hardware device.
Apple just finished its “awe-dropping” event, unveiling the iPhone 17 family—four distinct models that raise the bar in performance, display, and design. Each model offers a step up from the last. You can view the entire Apple unveiling video below.
Here’s what’s new:
iPhone 17 (Base Model)
The iPhone 17 introduces a larger 6.3-inch Super Retina XDR OLED display with ProMotion (1–120 Hz) and up to 3,000 nits brightness, ensuring crisp visuals and smooth scrolling. It runs on the new A19 chip built on 3 nm tech for noticeably faster, more efficient performance.
The rear camera system steps up with 48 MP Fusion Main and Ultra Wide lenses, plus a 2× optical telephoto, and features the Portrait-friendly Bright Photographic Style.
Notably, Apple debuted the first square 18 MP Center Stage front camera on an iPhone with 4K HDR stabilization and Dual Capture for front-and-rear recording. Storage starts at 256 GB.
iPhone 17 Air
Next is the ultra-slender iPhone Air, the thinnest iPhone yet at 5.6 mm, crafted with a titanium frame.
It employs the A19 Pro chip, paired with a C1x modem, N1 wireless chip, and supports eSIM only. The 6.5-inch ProMotion display hits 3,000 nits and taps into iOS 26’s adaptive power mode for all-day battery life.
Cameras include dual 48 MP Fusion rear lenses, a 12 MP telephoto, and the 18 MP Center Stage selfie cam. It also packs Wi-Fi 7, Bluetooth 6, and ships in new colors starting at $999 for 256 GB.
iPhone 17 Pro & Pro Max
Finally, Apple pushed Pro performance further.
Both models run on A19 Pro and use an internal vapor-chamber cooling system within an aluminum unibody for enhanced thermal efficiency. They offer triple 48 MP Fusion cameras—including a new 8× zoom Telephoto sensor—and the same 18 MP Center Stage front cam.
Video creators gain Pro-level tools: ProRes RAW, Apple Log 2, Dolby Vision HDR, and Genlock support. Battery life sees a massive bump. The Pro line also embraces recycled materials and expanded storage options, up to 2 TB on the Pro Max.
Apple is planning to launch a new AI-powered web search tool for Siri next year, according to a new report from Bloomberg, as it seeks to compete with competitors who’ve invested heavily in AI. Details are still scarce and could change before launch, but it sounds like the whole thing could be powered by a custom version of Google’s Gemini.
The new system is being called World Knowledge Answers internally, according to Bloomberg, and may even be added to Safari and Spotlight. An AI-powered version of Siri has been long delayed, after Apple promised in 2024 that it would be available in June 2025. That, of course, was pushed back.
The new AI features for Siri will likely create a search experience on Apple devices that utilizes the unique access it has to things like text, photos, and videos. And it’s likely to create summaries based on web searches that are more powerful than what’s available with the currently anemic Siri.
But even if a custom-built Gemini is used for some functions like summarizing, it would probably run on Apple’s own Private Cloud Compute servers in order to maintain privacy, according to Bloomberg. Google has already reportedly delivered Gemini’s summarizing tech to Apple, but it’s still being fine-tuned. Apple previously considered buying Perplexity but is no longer interested, according to the news outlet.
Privacy has been a tricky problem to solve when tech companies tackle AI. OpenAI’s Sam Altman has warned that anyone using ChatGPT as a therapist should know that there are no doctor-patient confidentiality laws for AI chatbots. And Signal’s Meredith Whittaker has warned that agentic AI capabilities are extremely difficult to pull off in an encrypted way.
Apple has gotten some heat from investors for seemingly slipping behind other startups in implementing AI. But there has been good reason to be cautious. Generative artificial intelligence often doesn’t work as advertised, and there are a number of hurdles to making it safe. OpenAI has learned that lesson the hard way, as reports of AI psychosis flood the internet.
But Cook has recently signaled that he understands how transformative the tech could be for Apple, dubbing the AI revolution “as big or bigger” than the internet during a global all-hands meeting last month.
Apple’s ramp-up with AI is expected to take some time, as Bloomberg notes. The company is announcing a new iPhone next week, but the device isn’t expected to have any “major” new AI features.
Apple’s Siri overhaul may include an AI-powered web search tool with technology powered by Google’s Gemini, according to a new report from Bloomberg’s Mark Gurman. The iPhone maker, which has been criticized for falling behind in the AI race, delayed its long-awaited Siri update until 2026. In the meantime, the company has been scrambling to determine whether its own AI models alone will work well enough to make its upgraded Siri competitive with the AI answer engines available today from tech companies like OpenAI, Perplexity, and Google.
Per Bloomberg, Apple could be turning to Google for a solution to its problems. The report claims that Apple and Google reached a formal agreement this week that will see Apple testing a Google AI model in Siri. If successful, the technology could also be used in other areas of iPhone software, including the Safari browser and Spotlight search, which is available on the Home Screen.
In previous years, Spotlight seemed to be ramping up to become a rival of sorts to Google, as it allowed iPhone users to bypass web searches to get basic answers about popular topics, like information about actors, musicians, TV shows, and movies, among other things. With AI chatbots, however, consumers can now source quick answers about a wide range of topics beyond those that could be found on Wikipedia.
The report suggests that the upgraded search experience’s interface will use a combination of text, photos, videos, and local points of interest, as well as an AI-powered summarization feature. It will also be able to tap into users’ personal data and let them navigate their devices via voice.
Though Apple announced some sweeping AI changes to Siri at its WWDC event last month, it’s been unclear when these changes will reach iPhone screens. But a new report from Bloomberg chief correspondent Mark Gurman clarifies the timeline.
Gurman, who has an 86.5% accuracy rate on the Apple information he has leaked in the past, wrote in his Sunday Power On newsletter that though Apple Intelligence will come out this fall, the most important Siri updates will arrive next spring.
This includes Siri accessing and working with other Apple apps based on a simple command. For example, when asked, “Hey Siri, what’s my driver’s license number?” Siri will soon be able to go through the Photos app, find a picture of the license, and even enter the number into a web form.
Another Siri upgrade on hold until next year is contextual understanding: the ability to interpret what an iPhone user is looking at.
In the fall, Siri will still get supercharged with ChatGPT and get a design makeover, but the voice assistant will lack those key contextual features, according to Gurman.
The Siri update won’t be complete until developers beta test the features starting in January and Apple releases the upgrade publicly in the spring of 2025, per the same report.
A person holds a phone in front of the Siri logo. Photo by Artur Widak/NurPhoto via Getty Images
The Bloomberg report aligns with Apple’s public statements.
Apple noted in its June press release that Apple Intelligence will start rolling out this fall in beta, but “some features, software platforms, and additional languages will come over the course of the next year.” The fine print leaves room for certain features to debut later, after Apple Intelligence’s fall release.
Apple stepped into the AI game nearly two years after ChatGPT was launched but decided to take a more collaborative approach to AI development. Along with creating AI in-house, Apple also chose to integrate ChatGPT directly into its products and open up the doors for AI models from other competitors.
AI could also prompt hundreds of millions of iPhone users with older models to upgrade this fall — Apple Intelligence only works on the newest iPhones.
At WWDC 2024, Apple unleashed a blitzkrieg of software updates to put AI, or “Apple Intelligence,” front and center in your iPhones, iPads, and Macs. After Samsung and Google pushed AI on phones, it’s now Apple’s turn to try and flip the script to make smartphones, tablets, and laptops “smarter” by introducing an AI of its own.
If you woke up this morning hoping for some big hardware announcement, or hell, even a hint or teaser for a new phone or Mac design, it’s best you return to your comfortable cave and hibernate until the next big Apple showcase. Regarding software, Apple Intelligence will be available in most user-end apps with automatic summarizations and AI-enhanced photo editing. ChatGPT is coming to the latest iPhones as the Cupertino, California tech giant is set to make the chatbot accessible anywhere on the phone without needing the app.
WWDC 2024 — June 10 | Apple
If you have no interest in AI, there are a few new updates to get excited about. iOS 18 and iPadOS 18 are incoming, promising some long-awaited features. One is the iPhone lock screen update, which allows users to place their widgets and icons where they want. Another is the update to Messages that will finally enable it to use the RCS protocol. Say goodbye to those green bubbles forever.
Meanwhile, iPads and Macs are getting a few new, unexpected features, like a full-on Calculator app that supports scribbling and iPhone mirroring on macOS Sequoia. Many of these updates are slated for fall of this year, though the betas should start rolling out in the next few months.
What’s Up With ‘Apple Intelligence’
Apple Intelligence Is Apple’s Big AI Product for Its Entire Ecosystem
First on the list is “Apple Intelligence.” The Cupertino company’s AI is just what it says on the tin: an entire ecosystem for navigating users’ lives. There’s a lot going into it, but eventually the software should include multimodal AI vision capabilities and work within all the apps on your iPhone, iPad, and Mac. The only problem is that we still don’t know exactly when any of these features, in whole or in part, will be available.
Apple Intelligence can Rewrite or Proofread Text
Apple promises the new AI writing tools can summarize your text and add an easy “TLDR” to the top of emails. Like Google’s Gemini, the rewriting feature could include different text styles to make it sound more “Friendly” or “Concise.” You also have the option to add tables, lists, or summaries to the text. This should work in pretty much all Apple apps and some third-party apps.
Apple’s Emails Will Summarize Important Points Before You Open Them
The Priority feature in the Mail app will show you your most important emails or messages when you have a lot of them coming in at once. These condensed notifications will appear right on your iPhone’s lock screen, and they work with a new Focus that cuts down on the number of notifications to show only the most important ones.
Apple Will Let You Create AI Images, Including ‘Genmojis’
Of course, Apple wouldn’t stay its hand from the AI image generation game. The new Image Playground is built into Pages, Messages, Freeform, and several other apps.
There are three styles on offer: animation, illustration, or sketch, but you have a regular prompt bar to create whatever (somewhat disturbing) images you desire. There are also new AI-generated emojis called ‘Genmoji,’ which can be sent as a sticker or Tapback. You can also create a Genmoji of one of your friends, if you trust the feature enough. Apple promises all its images are generated on-device.
There’s also a new Magic Eraser-like tool in Photos to remove unwanted elements from an image before filling in those missing pixels.
Apple Intelligence Can Pull Up Your Files and Photos
There are a lot of big promises coming about thanks to AI. Apple claims its new AI system will eventually be able to perform rather complex actions, like pulling up photos and files from any of your apps. It should be able to work between apps, so it will know when your meetings are and what your plans are for the day when you ask it to send a text that works around your schedule.
Apple Promises Its AI Won’t Save Your Data
Much of the AI on Apple’s devices runs on-device, but heavier requests are supposed to run through Private Cloud Compute. Apple promises to maintain your privacy by determining whether a request needs any off-device AI; it will then send only parts of the request to the cloud. Apple also promises outside parties will be able to inspect its servers to verify the big privacy claims.
Siri Has a New Look and a Whole Lot More Capabilities
Poor, beleaguered Siri is finally receiving those long-rumored AI upgrades, but we may need to wait a long time to see them in action. The Siri updates will allow the assistant to interact with iPhone and iPad apps far more than it currently can.
For one, Siri now has a new logo and look, making the borders of the screen wavy whenever the assistant gets called up. Siri will maintain conversational context and will be able to work off your previous requests. Now you can type to chat to Siri as well. Double tapping on the bottom of the screen allows you to communicate with Siri directly.
Siri can also act on what’s happening on-screen, and take actions across apps, like adding a photo from the Photos app to the Notes app. Eventually, the idea is that Siri will be able to take specific actions in more and more apps over time.
The digital assistant should also become more ingrained with users’ “Personal Context.” Siri should know your emails, plans, calendar events, and texts well enough to find all the necessary information.
Siri Will Be Your Best How-To Machine for Apple Products
Siri should be able to send you a how-to guide for anything related to your Apple products. This comes baked into Siri and will work with all the most commonly asked questions about Apple products.
Siri Can Use ChatGPT ‘Seamlessly’
While we don’t have a good idea of when Siri will receive its most important updates, we know that the current stopgap will be ChatGPT integration directly on users’ devices. The chatbot will be accessible straight from Siri and the new compose feature, and you can use it to generate DALL-E images as well. Apple promises this integration will be powered by GPT-4o and free to use, without paying for an account.
Apple promises your activities won’t be logged, and you can access the ChatGPT paid features if you link your account. ChatGPT integration will be coming to all the Apple ecosystem’s new updates later this year.
iOS 18 is Promising some Long-Awaited Customization Features
iOS Now Supports RCS
As a last-minute note to end its talk about iOS 18, Apple confirmed that the next version of iOS will support the RCS protocol. There’s no word yet on exactly what form this will take, though Android Authority first recognized that it could be RCS Universal Profile 2.4. This could be the true end to green-bubble tyranny, but we’ll learn more as we get closer to release.
iOS 18 Lets You Finally Rearrange Your Home Screen Apps
iOS 18 will be a big one for folks who have long demanded Android-like customizability on the iPhone. Now, you can rearrange all your apps and widgets on the home screen however you like, so you can finally frame your background wallpaper without having an app covering up your kids’ faces. Apple goes further by allowing users to set the tint and tone of the app’s icons themselves.
You Can Soon “Lock” Any App in iOS 18
The next iPhone update will allow users to lock and hide apps so anyone using your phone won’t have immediate access without biometric scanning or a PIN. Similarly, you can now tuck apps away into a hidden folder if you don’t want visitors to your iPhone getting into your more sensitive apps without a passcode.
Messages Includes Full Emoji Tapbacks
Are you annoyed you can’t do full emoji reactions to texts like you can on Android? The iOS Tapback feature is receiving full emoji support, so you can respond to your friend’s queries with as many poop emojis as their messages require.
Messages Text Effects Will Let You Emphasize Certain Words
The Messages app in iOS 18 is expanding the ability to emphasize words. Instead of just emphasizing the names of people, users can use Text Effects to make certain words blow up or jiggle. The app will automatically suggest specific effects for certain words, and you can add any effect to any text you want.
Messages are also gaining the ability to use text formatting, allowing you to underline, bold, or italicize words or phrases.
Game Mode on iPhone
Mac’s Game Mode is getting a version on iPhone. The mode should kick in automatically when you’re in a game, minimizing background tasks to put as much processing power as possible into the game. It should also improve latency with controllers and AirPods.
Messages Via Satellite
If you find yourself without cell service, Apple will let you use your iPhone to text friends and family over Messages while off the grid. You can still send emojis and Tapbacks, and Apple claims it’s E2E encrypted. This will only be available on the iPhone 14 or later, which come with satellite support.
Apple Maps Now Allows You to Get Hiking Trail Info
Apple Maps now has access to topographic trail maps, letting you pull up hiking loops on your phone. The app will show the overall length and elevation gain of a trail or loop, along with its various entry points.
Tap to Cash Allows You to Pay Your Friends With Your Phone
Those iPhone users keen on Apple Cash will soon be able to send money to each other using the same action you can use to share your contact information. Holding the two phones near each other with Apple Cash active will send money from one wallet to the other. Additionally, event tickets are being redesigned to show you details about the venue and other essential information.
Photos App is Gonna Look a Hell of a Lot Different
The Photos app now has a new design that shows all your photos in a single grid. You can find photos by month or year, and filter them to exclude screenshots.
The new Collections will let you section different photos into topics like People & Pets or Recent Days. This will let you see your photos in a collage. In selections like Trips, you can find your vacations or travels by date. You can also pin different collections.
The Favorites carousel now shows you a slideshow of photos from various favorite collections.
iPadOS 18 Promises Some New and Unique Features for Apple’s Tablets
Floating Tab Bar on iPad Might Make it Far Easier to Use
Apple is introducing a new floating tab bar for iPadOS 18. It essentially works as an easy-to-access menu that can morph into a sidebar for even more fine-tuned controls. It should work with most Apple apps on the iPad, and there are new animations to accompany the update. Apple added that it’s also working to make browsing through documents easier on its tablets.
SharePlay Tap and Draw Will Let You Remote Control Another iPad
The new SharePlay update will let you make annotations on another person’s device and act as a remote control for their iPad. So, if you’re trying to describe to your mom how to access her iPad photos, you can use SharePlay and draw an arrow straight to them. Once you get frustrated enough, you can take control yourself.
Calculator on iPad (‘Yay’)
Finally, the iPad is getting a calculator app, but it’s far more interesting than that: it also works with Apple Pencil. Math Notes comes up from the calculator button, and if you write out an equals sign, it solves the equation for you, updating the answer live as you change the functions. It also works with lists that let you tabulate numbers rather quickly. Notes gains the same math capabilities as Calculator.
Notes’ Smart Script Will Fix Your Chicken Scratch as You Write
The on-board AI should be able to take your loose handwriting and make it a bit more legible as you write, while keeping your “style.” Text you paste into the Notes app should also be redrawn to mimic your handwriting.
So, What’s New in macOS Sequoia?
macOS Sequoia Will Allow You to Mirror Your iPhone on Your MacBook
macOS Sequoia is getting a lot of the features you can find on Apple’s other products. Continuity will let you access universal apps across the rest of the Apple ecosystem. More importantly, it will let you mirror your iPhone on a Mac: users can select and work in any of the iPhone’s apps, and audio comes through the Mac as well.
The iPhone stays locked while you mirror it and works with Standby mode. When your phone is connected to the laptop, iPhone notifications will also appear on Mac, and when you click on them, your iPhone mirror will open up.
You Can Place Your Mac Windows into Tiles, Like Windows 11
macOS Sequoia is adding a few new tiling features to make organizing your desktop more seamless. Dragging a window to a corner of the screen should automatically reorient and resize it into a clean tile.
You Can Preview Your Camera Before a FaceTime Call
Before hopping into a video meeting, Macs will let you preview what you look like on camera, the better to help you fix your makeup or remember to put on a shirt. There’s also a built-in background replacer if you can’t access one in whatever app you use.
Passwords App Will Show All Your Stuff
There’s now an all-new Passwords app to act as your one-stop shop for your keychains and important, sensitive information. It should be present across the entire Apple ecosystem. This should contain everything from WiFi passwords to verification codes to Passkeys.
Safari Reader Function Summarizes Text
The new updates to Safari introduce several AI functions. At the top of the list are AI-generated summaries of the content on web pages. The Reader mode changes the website’s look and brings up a table of contents. There’s no word on whether it also removes ads while it’s at it.
Game Porting Toolkit 2 Adds Better Windows Compatibility
Apple first announced its Game Porting Toolkit at last year’s WWDC, and now there’s a sequel that promises to make it easier to port more hardcore titles to Apple’s framework. The company detailed several new games coming to Mac, including Frostpunk 2 and Control. Assassin’s Creed Shadows is also coming to iPad, and Prince of Persia: The Lost Crown is coming to Mac.
How About watchOS 11 and AirPods?
AirPods Can Sense Your Head Nods For Saying Yes to Siri
If you’d rather not be that asshole in the elevator talking on your Bluetooth headset, AirPods will soon get a feature that should track your head movements. If there’s an incoming call, you can nod or shake your head to respond yes or no to taking it. After it rolls out to AirPods, we’ll have to see what other uses this gesture has.
Apple Watch’s watchOS 11 Gets Training Load
There are a few new features on the Apple Watch for fitness fans. With Training Load, an algorithm rates the effort you put into your recent exercise, which might tell you if you were going too soft or too hard in your latest workouts. Plus, you can customize your Fitness app to see the data you want at a glance.
The Next watchOS Update Includes a Vitals App
The Vitals app will look at your health data to check all your key metrics and even tell you whether your drinking has impacted your health. It might show your heart rate and tell you whether it’s within your typical range; if it’s not, the app should give you a rundown of what’s happening and what could be causing the issue.
Apple Watch Will Open Up Different Widgets Depending on Context
Apple’s Smart Stacks will automatically add weather or translation widgets to your main screen if it thinks you need them. This might come up when it looks like it’s about to rain or when you’re traveling in a foreign country.
The Apple Watch Will Determine Which Photos Work Best for Your Home Screen
Like the new tvOS update, the Apple Watch will look through your photos and select those with enough blank space to fit the time. It should also be able to place the time in front of or behind certain photo elements, making it look far more like the photo belongs on the watch face.
AirPods Pro Now Have Voice Isolation and Spatial Audio in Gaming
AirPods Pro is getting an update that will add voice isolation to remove background noise for the sake of whoever’s on the other end. Additionally, developers can access an API to add spatial audio for games. This will add a surround-sound type experience to the game, first coming to Need for Speed Mobile.
Is There Anything New Coming to Apple TV+ and Vision Pro?
visionOS 2, the Squeakquel, Will Let You Project Your Mac Screen Across Nearly 180 Degrees
Screenshot: Apple
Apple released the Vision Pro in February, and visionOS 2, the headset’s first major update, is coming down the pike just a few months later.
The big new update includes several spatial photo features. The Vision Pro can turn 2D images into 3D-ish spatial photos, and you can share those spatial photos over SharePlay. Apple is also adding a few new gestures: tap to open the home view, or turn your wrist to open the control center. Later this year, Apple plans to update the OS with better Mac screen integration. This will expand the total view of your projected Mac screen, and with dynamic foveation, it can create a wraparound display that stretches nearly 180 degrees.
InSight on Apple tvOS Will Offer a Few Details on What You’re Watching
Apple’s new InSight feature on Apple TV+ is essentially Apple’s answer to Amazon’s X-Ray. It gives you a quick summary of the content you’re watching, plus information about the actors on screen and perhaps a little trivia about the scene as it plays. There are also a few new screensaver animations, like one featuring Peanuts’ Snoopy, and your photos will now be reframed to fit the on-screen clock and look like they belong there.
The ideal smart home seamlessly anticipates your needs and instantly responds to commands. You shouldn’t have to open a specific app for each appliance or remember the precise voice command and voice assistant combination that starts the latest episode of your favorite podcast on the nearest speaker. Competing smart home standards make operating your devices needlessly complicated. It’s just not very … well, smart.
Tech giants try to straddle standards by offering their voice assistants as a controlling layer on top, but Alexa can’t talk to Google Assistant or Siri or control Google or Apple devices, and vice versa. (And so far, no single ecosystem has created all the best devices.) But these interoperability woes may soon be remedied. Formerly called Project CHIP (Connected Home over IP), the open source interoperability standard known as Matter arrived in 2022. With some of the biggest tech names, like Amazon, Apple, and Google, on board, seamless integration may finally be within reach.
Updated May 2024: Added news of the Matter 1.3 specification release, progress with the major players, a section on what you can do with Matter, and more details on potential functions.
What Is Matter?
Matter enables different devices and ecosystems to play nicely. Device manufacturers must comply with the Matter standard to ensure their devices are compatible with smart home and voice services such as Amazon’s Alexa, Apple’s Siri, Google’s Assistant, and others. For folks building a smart home, Matter theoretically lets you buy any device and use the voice assistant or platform you prefer to control it. (Yes, you can use different voice assistants to talk to the same product.)
For example, you can buy a Matter-supported smart bulb and set it up with Apple HomeKit, Google Assistant, or Amazon Alexa—without having to worry about compatibility. Right now, some devices already support multiple platforms (like Alexa or Google Assistant), but Matter will expand that platform support and make setting up your new devices faster and easier.
The protocol’s first release runs on Wi-Fi and Thread network layers and uses Bluetooth Low Energy for device setup. While it supports various platforms, you must choose the voice assistants and apps you want to use; there is no central Matter app or assistant. Because Matter works on your local network, you can expect your smart home devices to be more responsive, and they should continue to work even when your internet goes down.
What Makes Matter Different?
The Connectivity Standards Alliance (or CSA, formerly the Zigbee Alliance) maintains the Matter standard. What sets it apart is the breadth of its membership (more than 550 tech companies), the willingness to adopt and merge disparate technologies, and the fact that it is an open source project. Interested companies can use the software development kit (SDK) royalty-free to incorporate their devices into the Matter ecosystem. This is much simpler than certifying devices individually with each smart home platform.
Growing out of the Zigbee Alliance gives Matter a firm foundation. Bringing the main smart home platforms (Amazon Alexa, Apple HomeKit, Google Home, and Samsung SmartThings) to the same table is an achievement. It is optimistic to imagine a seamless adoption of Matter across the board, but it has enjoyed a rush of enthusiasm with many smart home brands jumping aboard, including August, Schlage, and Yale in smart locks; Belkin, Cync, GE Lighting, Sengled, Signify (Philips Hue), and Nanoleaf in smart lighting; and others like Arlo, Comcast, Eve, TP-Link, and LG.
When Did Matter Arrive?
Matter has been in the works for years. The first release of Project CHIP was due in late 2020, but it was delayed to the following year, rebranded as Matter, and then touted for a summer release. After another delay, the Matter 1.0 specification and certification program opened in 2022. The SDK, tools, and test cases were made available, and eight authorized test labs opened for product certification.
The first wave of Matter-supported smart home gadgets went on sale in the fall of 2022, and we have seen a steady trickle since then. The first update to the specification, Matter 1.1, arrived in May 2023 and consisted largely of bug fixes. Announced in October 2023, Matter 1.2 added support for nine new device types, including refrigerators, robot vacuums, and air purifiers, alongside improvements to existing categories.
The Matter 1.3 specification was published in May 2024, adding energy management, EV charging, and water management alongside support for new devices, including ovens, cooktops, and laundry dryers. It also brought improvements to Matter Casting, so on top of being able to cast from your phone to your TV, other smart devices—like your robot vacuum—can send messages to your TV to warn you if they’re stuck, for example.
There’s a lot riding on next week’s WWDC 2024 keynote. The stakes are far higher than your standard post-event market moves. The pressure for Tim Cook and crew to deliver the goods is, in a very real sense, even higher than it was in the lead-up to last year’s Vision Pro announcement.
On Monday, Apple will lay out its AI plans. The subject has been a massive question mark looming over Cupertino for the last few years, as competitors like Google and Microsoft have embraced generative AI. There’s a broad industry consensus that systems powered by large language models like ChatGPT and Gemini will profoundly affect how we interact with our devices.
Apple is expected to announce a partnership with OpenAI that will bring the company’s smarts to the iPhone and Mac. Apple’s near-term strategy is a deep integration between existing properties and generative AI, with Siri at the center. Since Siri’s debut in 2011, Apple has pushed to make the voice assistant an integral part of all its operating systems.
In the intervening 13 years, however, Siri has fallen short of the revolution Apple promised. There are plenty of reasons for this, though the primary one is capability. The concept of an artificial voice assistant predates Siri by decades, but no one had fully cracked it, for good reason. As phone makers and app developers have transformed smartphones into everything devices, these assistants’ jobs have become increasingly complex.
As impressive as the Stanford Research Institute’s work was, the technology required for a frictionless experience simply wasn’t ready. Siri co-founder Norman Winarsky addressed the underlying issue in 2018, noting that Apple’s initial plan was a far more limited assistant that handled things like entertainment and travel. “These are hard problems, and when you’re a company dealing with up to a billion people, the problems get harder yet,” Winarsky noted at the time. “They’re probably looking for a level of perfection they can’t get.”
Generative AI isn’t at that level of perfection, either — not yet, at least. Hallucinations are still a problem. That’s precisely why, even after the massive buzz of the past few years, it still feels like we’re very much in the baby steps phase. If anything, I would say that Google, for one, has been overly aggressive in places. The best example of this is the company’s decision to surface Gemini results at the top of searches.
When something is prioritized above trusted resources in the world’s dominant search engine, it needs to get things right as much as humanly possible, and not, you know, tell people to eat glue. Google labels Gemini results a product of its “Search Labs,” but surely a majority of users don’t understand what that means in terms of product maturity, nor can they be bothered to click through for more information.
Over the past few years, I’ve met several researchers who have used the term “magic” to describe the “black box” results of large language models. This isn’t a knock against all of the amazing work happening in the space, so much as a realization that there’s still so much we don’t know about the technology.
Arthur C. Clarke put it best: “Any sufficiently advanced technology is indistinguishable from magic.”
One place Google has been more intentional, however, is with its integration of Gemini into Android. Rather than replacing Assistant outright, Google has been integrating its generative AI platform into different applications. Users can also opt in to making Gemini their default by assigning it to the Assistant button on Pixel devices. This implementation requires deliberate action on the user’s part, at least thus far.
Gemini hasn’t completely conquered Android yet, but Google is clearly signaling a day in the not-too-distant future when it replaces Assistant outright. I half expected an announcement along those lines at I/O last month, though I’m glad the company ultimately opted to give Gemini more time to bake.
Whether the Assistant name sticks around is ultimately a branding decision. For its part, Apple is very wedded to the Siri name. It has, after all, spent well over a decade pitching the product to consumers. Sooner rather than later, however, generative AI will eat the smart assistant space.
Voice assistants in general are having an existential moment. Smart speakers have been a broad bellwether for platforms like Siri, Alexa, and Google Assistant, and shipments have declined after heating up during the pandemic. It’s unfair to characterize the category as doomed, but it may well be in the long run without the proper shot in the arm.
Generative AI is poised to be the logical successor, but the first round of hardware devices built around these models, including the Humane Ai Pin and Rabbit R1, has only been a testament to how far the category has to go before it can deliver a consistent experience for mainstream users.
Apple will finally show its hand on Monday. While rumors point to the company transitioning a number of employees to generative AI operations following its electric car implosion, all signs point to Apple having ceded a significant head start to the competition. As such, its most logical play is a partnership with a reigning powerhouse like OpenAI.
Shortly after the Siri acquisition was announced, Steve Jobs was asked whether the company was trying to beat Google at its own game. “It’s an AI company,” Jobs noted. “We’re not going into the search business. We don’t care about it. Other people do it well.”
The company’s approach to generative AI is currently in the same place. At this stage, Apple can’t beat OpenAI at its own game, so it’s partnering instead. But even the best of the current models have a way to go before they’re ready to fully replace the current crop of smart assistants.
This guide was updated on March 28, 2024, at 2:40 p.m. ET to reflect the latest information from Bloomberg and other sources.
Can This iPad Replace Your Laptop?
The rumor tornado that has circled the upcoming OLED iPad Pro has finally started to wane, leaving a field scattered with little nuggets of information about its size, color options, and a few juicy details surrounding its new OLED screen. The new-school iPad Pro and iPad Air are supposed to drop sometime this spring, though we may need to wait until May for the full reveal. We still have a few months to get excited about Apple’s first real push into OLED outside its phones.
Apple didn’t release any 11th-gen iPads last year, which is noteworthy considering the Cupertino, California, company has released one yearly for over a decade. The rumors make it clear that Apple thinks this latest refresh is a big one, and it could help clean up the somewhat confusing SKU bloat that’s hindered the tech giant’s tablet line for years.
When Could Apple Release the OLED iPad Pro?
Most initial rumors suggested that Apple would showcase its new iPads in March. However, new details are coming out of Apple’s production line, and a report from Bloomberg says we’ll have to wait until sometime in May. Based on several anonymous sources, the report notes that the May release will be a big one centered around the new iPads. The Pro models will receive a brand new screen, while fans of the iPad Airs will have a new size category to play with at 12.9 inches.
The Cupertino company just released its new M3 MacBook Airs with a 13- or 15-inch screen in March, so this push to May isn’t so surprising as the company wants to spread out its releases and stay in the media spotlight for longer. The report notes that Apple needed to finish up the software for its upcoming tablets, hence the delay. In January, Bloomberg’s Apple guru Mark Gurman reported that Apple has wide-ranging designs. Nothing’s changed as far as what’s coming down the pike. According to Bloomberg, this first iPad refresh in 18 months will include four models: the J717, J718, J720, and J721.
As first reported by Apple Insider, quoting from market researchers at Display Supply Chain Consultants, there have been a few snags with manufacturing the latest tablets. Still, now that Apple has a little more breathing room, there hopefully won’t be any more delays.
May would also be a month before Apple’s biggest event of the year, WWDC 2024. That’s where most rumors suggest Apple will introduce far more AI enhancements in iOS 18. Not much has been said about AI on iPadOS, but if it’s not there to start, it will only be a matter of time before Apple slaps some version of its AI features on its tablets.
Moreover, there’ve been hints at additional iPad accessories that could also find their way onto the scene alongside the new Airs and Pros. People digging into the code for iOS 17.4 found mentions of an Apple Pencil 3 that connects with Apple’s Find My app. The previous Apple Pencil, a surprise release last year, featured a slide-out USB-C port but lacked pressure sensitivity, so a new version with more sensitivity options could fit well with the new “Pro” lineup.
But wait, there’s more. MacRumors claimed, based on a source who works with Apple parts, that the next iPad could support MagSafe wireless charging. There hasn’t been word that the Cupertino company would make an all-new MagSafe peripheral for iPads, but we can’t help but imagine a charging unit that could double as a hands-free stand. Bloomberg had previously hinted at Apple trying to create a glass-backed iPad that would work with MagSafe.
The new iPad could also introduce an all-new keyboard. Gurman previously mentioned that Apple is trying to redesign the Magic Keyboard to work with the iPad. Rumors suggest the new keyboard will have a larger trackpad. Most importantly, Apple could switch to aluminum for the top portion of its keyboard, which would give it much more of a MacBook feel than ever before. The cover material would remain the same, but it would make the whole keyboard a lot sturdier for those who want to use their iPad as their main daily driver.
What Do We Know About the iPad Pro’s OLED?
Photo: Caitlin McGarry / Gizmodo
Based on routine hints from industry analysts, it’s become well known that Apple wants to make an 11.1-inch and a 13-inch iPad Pro with OLED. That’s compared to the most recent 12.9-inch mini-LED version (called Liquid Retina XDR) and 11-inch IPS LCD version that currently occupy the top end of Apple’s tablet line. Those rumors have since been reconfirmed by more recent reports.
The new generation of iPads will be sized slightly differently from previous models. 9to5Mac reported, based on anonymous sources, that the new iPad Pros will be nearly 1 mm thinner than the current generation. The existing 11-inch iPad Pro is 5.9 mm thick, but the new one could be 5.1 mm. The 12.9-inch version currently sits at 6.4 mm, but the new one could be a mere 5.0 mm.
To complement the new sizes, rumors also suggest we’ll see an updated iPad Air that stretches the screen to 12.9 inches.
What’s Happening with iPadOS 18?
There could be some interesting changes in store for the next OS update to come along with the iPad refresh. For one, the next tablet operating system could drop support for several older-gen iPads. According to a rumor first reported by 9to5Mac, this includes the 2nd-gen 12.9-inch iPad Pro, the 10.5-inch iPad Pro, and the 6th-gen iPad. The rest of the tablets from 2019 and later should still have access to the new OS build.
If iOS 18’s rumored AI enhancements prove true, it would only make sense to bring them to the iPad as well. This could potentially reinvent Siri and perhaps add several new AI-enhanced functions to Apple’s portable platforms.
At the tail end of last month, Apple released the new iPadOS update 17.4.1, which MacRumors confirmed didn’t offer any hints about iPadOS 18. At the very least, we can speculate that iPadOS 18 will come out at the same time as iOS 18, which will likely debut at WWDC 2024.
How Powerful Will the OLED iPad Pro Be?
We’ve known for a long while now that Apple wants to use its new M3 chip inside the iPad Pro. That was before we even had a chance to analyze the power of Apple’s latest M-series silicon, but since then, we’ve fully tested the capabilities of the 3nm M3 and its more powerful brethren, the M3 Pro and M3 Max.
Most configurations of the M3 for both the MacBook Air and MacBook Pro come with 8 CPU cores and either 8 or 10 GPU cores, paired with 8, 16, or 24 GB of RAM. Considering the current iPad Pro tops out at 16 GB of integrated memory, you can expect the OLED version to do the same.
We’ve found the M3 chip to be pretty versatile at both productivity and graphical tasks. It’s marginally better than the M2 chip in all benchmarks, so it will certainly be an upgrade for those used to the M1 or M2 versions, even ignoring the new eye-catching display.
How Much Will the OLED iPad Pro Cost?
Photo: Caitlin McGarry / Gizmodo
OLED normally costs more than LCD, partly because of materials and partly because fewer factories and makers are available to manufacture the individual components. Based on industry sources, Korean tech rumor site The Elec (via MacRumors) claimed that Samsung is producing the first batch of the 11-inch OLED iPads but that LG is also working on the 13-inch versions.
The Elec also noted that industry analysts expect Apple to ship 8 million units this year. That’s less than what the industry thought the company would ship last year, though the figure may be based on expected demand more than anything. Apple did have a few issues last year with MacBook sales being down, leading the company to release the M3 MacBook Pros not even a year after it finally started shipping the M2 versions. This did help build hype around the M3 chip, the most powerful APU the company had released to date.
That said, it would only make sense for the iPad Pro to see a price increase. Trying to guess an exact price would be like tossing darts blindfolded, but the 13-inch iPad Pro currently starts at $1,099 and can go upwards of $2,000 if you opt for more storage and cellular connectivity. We could guess that a new iPad Pro would cost at least $100 more than the current generation. The Elec has previously reported the next iPad could cost several hundred dollars more, even putting the price at $1,500 for the 11-inch and $1,800 for the 13-inch model, a higher jump than seems practical.
Also, considering the Magic Keyboard’s current starting price of $300, a new aluminum build will likely increase the overall cost. The future iPad Pro will be a much more luxurious product, which will also recast the Air as the more consumer-grade option overall.
A research paper quietly released by Apple describes an AI model called MM1 that can answer questions and analyze images. It’s the biggest sign yet that Apple is developing generative AI capabilities.
Berkshire Hathaway purchased 2.8 million shares of the Liberty SiriusXM tracking stock in recent days, apparently seeking to capitalize on Liberty Sirius’ discount relative to the value of its stake in Sirius XM Holdings, the satellite radio company.
While it can be fun or playful to send titillating messages and photos to a partner, it’s important to have some guardrails. Here’s what you should never say while sexting.
“Good morning. You are scheduled to receive a picture of my junk. Please, reply 1 to confirm that you are horny. Reply 2 to reschedule.”
It’s redundant. They already confirmed via email.
“Sorry, I’m all out of cum tonight. I have a fresh shipment arriving Wednesday, though, if you’re interested.”
Wednesday isn’t soon enough. Your lover needs cum now!
“I give you scratchies behind the ear and rub your big belly.”
Fine to say later on, but you need to pace yourself. Starting with something as hot and heavy as this right away will make them blow their load immediately.
“Are those nipples? What is that, a knee? Wait—am I supposed to look at this sideways?”
All close-up mounds of flesh are equally sexy, so it shouldn’t matter what exactly you’re looking at.
“I’d probably ejaculate pretty quickly and then stand in front of the fridge nude while chugging blue Gatorade.”
You don’t have to be completely honest when they ask what you’d be doing if you two were together right now.
“I am excited to begin the holy act of Christian procreation between a man and a woman with you.”
Please, for the love of God, don’t send this without a photo of a promise ring.
“I am an asexual sea sponge.”
Then why are you sexting!?!
“I cut down on my phone bill substantially by sexting you with T-Mobile.”
It’s hotter to build up slowly to a sales pitch instead of diving right in.
“I masturbate my elbows as you slowly lick the inside of your fingernail.”
This is just going to give away that you don’t know what sex is.
“Siri, insert eggplant emoji. I said, Siri, insert eggplant emoji.”
This is not how you want to reveal to the woman you met online that you are actually 63 years old.
“HOMINA HOMINA HOMINA.”
Once is fine, but resist the urge to copy and paste this response to everything the other person says.
“C: creative. U: understanding. M: magical!”
Poetry is best shared face-to-face.
“Perhaps my penis should enter your vagina in a way that brings pleasure to us both?”
It’s important to check in with your lover first to make sure they enjoy pleasure.
“Please sign and return the attached PDF.”
Don’t send a nondisclosure agreement without first looping your attorney into the chat.
“*~*~*~ I walk a lonely road / The only one that I have ever known. ~*~*~*”
You are confusing sexts with AIM away messages again.
“Hey, I just got out of the shower, slipped, and cracked my head open on the bathroom sink. Want to see?”
Don’t ask, just send that pic!
“Hey, just so we’re clear, remind me what cum is again?”
You need to do your research before you begin sexting.
“I’m going to lick you like a child licks an ice cream cone on a hot summer day. One of those halcyon days, back when everything was beautiful, everything was free. Before we started to grow older, and saw the world’s true nature: bleak, gray, and disappointing. Now here we sit, hoping to ignite something resembling joy, but what we know is just a pleasurable opiate, sedating ourselves against the abject horror of existence.”
Actually, this is super hot.
“I am 14 years old!”
You should tell your parents what this man who found you on Roblox has been saying.
“Not through speeches and majority decisions will the great questions of the day be decided, but by iron and blood.”
It comes across as a little cheesy to quote Otto von Bismarck’s famous 1862 Blood and Iron speech.
“Gimme just one sec, gotta finish doing CPR on this unconscious guy!”
Typically, it’s considered unprofessional to sext at work.
“If for every time you cum, I cum four and a half times plus two times, then how many times did I cum if you came six times?”
It’s way too hard to do algebra while jerking off.
“Who is a horny baby? You are. You are! Coochie coochie coo!”
Not only is this wrong on so many levels, but if they are turned on by this, it’s probably illegal.
“I love you.”
You’re a liar, just like everyone else! You wouldn’t be saying that if you hadn’t met someone else you like more, you fucking cheating piece of shit. Well, just know there will be blood on your hands when they kill themselves.
“Mom, can you pick me up from soccer practice?”
Keep it in your pants, pervert! She’s your mother!