Apple plans to launch transitional AR glasses similar to Meta’s Ray‑Ban collaboration

Blink, and you’ve already missed the next big thing. AI, automation, quantum computing: while you’re scrolling past headlines, others are using new tech to get ahead. Instead of scrambling to catch up, have the latest breakthroughs delivered straight to you. This handpicked tech newsletter keeps you ahead of the curve, so you never miss an innovation that could change everything. Stop playing catch-up and start leading the trend. Subscribe now!

In mid‑March 2025, Bloomberg’s Mark Gurman reported that Apple executives are exploring a so‑called “transitional” pair of AR glasses akin to Meta’s Ray‑Ban Stories, designed to bridge the gap between the Vision Pro headset and fully capable smart eyewear. Sources say the prototype would resemble conventional frames with integrated bone‑conduction audio, dual outward‑facing cameras for pass‑through video, and a lightweight build for all‑day comfort, offloading heavy compute to a paired iPhone or a dedicated puck. The strategy echoes Meta’s low‑price Ray‑Ban collaboration but could leverage Apple’s custom silicon and visionOS for a more unified experience. Tim Cook is said to view the device as a critical stepping stone to mass‑market AR, aiming to gauge consumer appetite and spur developer interest before shipping a standalone glasses‑only model in 2027. However, analysts caution that Apple’s history of shelving ambitious wearables, like its recent cancellation of Mac‑tethered AR glasses, means the timeline and feature set remain fluid.

In early April 2025, President Donald Trump publicly acknowledged that his administration granted temporary tariff exemptions on electronics—including smartphones, laptops, processors and memory chips—after direct discussions with Apple CEO Tim Cook. During a press briefing alongside El Salvador’s president, Trump quipped, “I speak to Tim Cook… I helped Tim Cook recently,” framing the carve‑out as part of his “flexible” approach to trade policy.

The exemptions, which exclude key Apple components from both the 10% global levy and the steep reciprocal duties on Chinese imports, were backdated to early April, allowing companies to reclaim duties already paid and avoid future surcharges. Trump described the move as a way to avoid “hurting anybody,” while stressing that flexibility doesn’t equate to wavering on core tariff objectives.

Industry observers see the intervention as a high‑profile example of corporate influence over trade policy, underlining Apple’s importance to US jobs and supply‑chain resilience. Critics argue the ad hoc nature of these exemptions injects uncertainty into broader international trade relations and may invite legal challenges from trading partners.

In early April 2025, Apple unveiled a novel on‑device training mechanism for Apple Intelligence, allowing model refinement directly on iPhone hardware with no personal data sent to cloud servers. The system compares anonymised representations of user emails, messages and behavioural signals against synthetic datasets, and sends back only privacy‑preserving signals, never raw content, to Apple.

With this system, developers can integrate Apple Intelligence into visionOS and iOS 19 apps to deliver personalised features, from smarter photo curation to context‑aware Siri suggestions, while upholding Apple’s strict privacy ethos. Only abstracted signals about which synthetic inputs align with local usage are sent back, enabling improvements in email summarisation, predictive typing and search without exposing any user content. Users who prefer not to participate can simply decline the Device Analytics opt‑in, ensuring none of their personal data is ever analysed.
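To make the mechanism concrete, here is a minimal, hypothetical Python sketch of the “compare locally, report abstractly” pattern described above. The embed function, the similarity matching and the randomized_response noise are illustrative assumptions, not Apple’s actual implementation; the point is simply that only a noisy index of the best‑matching synthetic example ever leaves the device.

```python
import hashlib
import random

# --- Illustrative stand-ins (NOT Apple's APIs) --------------------------------

def embed(text: str, dims: int = 8) -> list[float]:
    """Toy embedding: hash the text into a small, deterministic vector.
    A real system would use an on-device ML model instead."""
    digest = hashlib.sha256(text.encode("utf-8")).digest()
    return [b / 255.0 for b in digest[:dims]]

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(y * y for y in b) ** 0.5
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def best_matching_synthetic(local_texts: list[str],
                            synthetic_texts: list[str]) -> int:
    """Compare local content against synthetic candidates ON DEVICE and
    return only the index of the closest synthetic example."""
    local_vecs = [embed(t) for t in local_texts]
    scores = []
    for candidate in synthetic_texts:
        cand_vec = embed(candidate)
        scores.append(max(cosine_similarity(cand_vec, v) for v in local_vecs))
    return max(range(len(scores)), key=scores.__getitem__)

def randomized_response(true_index: int, num_options: int,
                        flip_probability: float = 0.25) -> int:
    """Very simple differential-privacy-style noise: with some probability,
    report a random index instead of the true one, so no single report
    reveals what was actually on the device."""
    if random.random() < flip_probability:
        return random.randrange(num_options)
    return true_index

# --- Example usage -------------------------------------------------------------

local_messages = ["Lunch at 1pm tomorrow?", "Flight BA283 is delayed by 40 min"]
synthetic_pool = ["Dinner reservation confirmed for 7pm",
                  "Your flight has been rescheduled",
                  "Invoice #1042 is now overdue"]

true_idx = best_matching_synthetic(local_messages, synthetic_pool)
reported_idx = randomized_response(true_idx, len(synthetic_pool))
print(f"Device reports synthetic example #{reported_idx} (raw content never sent)")
```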

Perplexity’s Sonar models have surged ahead in the latest AI Search Arena benchmarks, demonstrating that depth of research is as important as raw accuracy. In controlled comparisons, Sonar‑Reasoning‑Pro‑High cited two to three times more sources than comparable models, resulting in richer, more verifiable responses that earned it the top slot against challengers like Google’s Gemini and OpenAI’s ChatGPT Search.

Beyond sheer citation volume, Sonar also excelled at nuanced query handling: its ability to weave multiple perspectives into a single answer gives users broader context without their having to click through numerous links. When rankings were normalized for citation count, Perplexity still maintained a lead, underscoring that its underlying retrieval and ranking algorithms deliver genuinely deeper search experiences. Observers say this shift reflects a growing demand for “answer engines” that prioritize transparency and source attribution over traditional link lists. With plans to integrate live data feeds and expand its context window, Perplexity is positioning Sonar as the new gold standard for AI‑driven search.
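“Normalizing for citation count” can be illustrated with a toy matched comparison in Python: compute raw win rates, then recompute them using only battles where both answers cited a similar number of sources, so citation volume alone cannot explain the outcome. The battle records, model names and numbers below are invented for the example; this is not the Arena’s actual methodology.

```python
from collections import defaultdict

# Hypothetical battle records: (model_a, model_b, winner, citations_a, citations_b).
battles = [
    ("sonar-reasoning-pro", "gemini", "sonar-reasoning-pro", 9, 4),
    ("sonar-reasoning-pro", "chatgpt-search", "sonar-reasoning-pro", 8, 3),
    ("gemini", "chatgpt-search", "gemini", 5, 4),
    ("sonar-reasoning-pro", "gemini", "sonar-reasoning-pro", 4, 4),
    ("sonar-reasoning-pro", "chatgpt-search", "chatgpt-search", 3, 5),
    ("gemini", "sonar-reasoning-pro", "sonar-reasoning-pro", 4, 5),
]

def win_rates(records):
    """Fraction of battles each model wins out of the battles it appears in."""
    wins, games = defaultdict(int), defaultdict(int)
    for model_a, model_b, winner, _, _ in records:
        games[model_a] += 1
        games[model_b] += 1
        wins[winner] += 1
    return {m: round(wins[m] / games[m], 2) for m in games}

# Raw preference rates, where citation-heavy answers may have an edge.
raw = win_rates(battles)

# Crude "normalization": keep only battles where both answers cited a
# similar number of sources, removing citation volume as an explanation.
matched = [b for b in battles if abs(b[3] - b[4]) <= 1]
adjusted = win_rates(matched)

print("raw win rates:     ", raw)
print("adjusted win rates:", adjusted)
```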

In mid‑April 2025, Microsoft unveiled a groundbreaking “computer use” feature for Copilot Studio, enabling AI agents to interact directly with both websites and desktop applications via simulated clicks, menu selections and text entries.

These agents can now automate complex workflows without requiring any API integrations, simply by “seeing” and acting on screen elements as a human user would; examples include data entry across legacy systems, invoice processing in desktop accounting software, and market research on dynamic web dashboards. The system even adapts to UI changes (e.g., relocated buttons or updated menus), ensuring robust operation amid interface updates.
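This kind of screen-level automation is easiest to picture as a perceive, decide, act loop. The Python sketch below is a heavily simplified, hypothetical illustration of that pattern: the ScreenElement class, the capture_screen_elements stub and the enter_invoice flow are assumptions for the example, not Copilot Studio’s actual components, but they show why locating elements by label and role rather than by fixed coordinates lets a flow survive interface changes.

```python
from dataclasses import dataclass

# A generic perceive -> decide -> act loop for screen-driven automation.
# Everything here is a simplified stand-in, not Copilot Studio internals.

@dataclass
class ScreenElement:
    label: str   # text the element shows ("Submit", "Invoice number", ...)
    role: str    # "button", "textbox", "menu", ...
    x: int
    y: int

def capture_screen_elements() -> list[ScreenElement]:
    """Stand-in for screen capture + UI parsing. A real agent would run OCR
    or accessibility-tree inspection here; we return a fixed layout."""
    return [
        ScreenElement("Invoice number", "textbox", x=240, y=180),
        ScreenElement("Amount", "textbox", x=240, y=230),
        ScreenElement("Submit", "button", x=240, y=300),
    ]

def find_element(elements: list[ScreenElement], label: str, role: str) -> ScreenElement:
    """Locate an element by what it *is*, not by hard-coded coordinates,
    which is why the flow keeps working when the UI layout shifts."""
    for el in elements:
        if el.role == role and label.lower() in el.label.lower():
            return el
    raise LookupError(f"No {role} matching '{label}' on screen")

def click(el: ScreenElement) -> None:
    print(f"click ({el.x},{el.y}) -> {el.role} '{el.label}'")

def type_text(el: ScreenElement, text: str) -> None:
    print(f"type  ({el.x},{el.y}) -> '{text}' into '{el.label}'")

def enter_invoice(invoice_id: str, amount: str) -> None:
    """One automated step: fill an invoice form the way a person would."""
    elements = capture_screen_elements()                                  # perceive
    type_text(find_element(elements, "invoice", "textbox"), invoice_id)   # act
    type_text(find_element(elements, "amount", "textbox"), amount)
    click(find_element(elements, "submit", "button"))

enter_invoice("INV-1042", "1,250.00")
```

In a production agent, the capture step would be backed by OCR or accessibility APIs and the decision step by a model choosing the next action, but the loop structure stays the same.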

Unlike the consumer‑focused “Actions” capability in Copilot, which integrates with a limited set of partner services, Copilot Studio’s computer use offers broad compatibility across virtually any website or application, making it an enterprise‑grade automation tool. Early access applications are open now, with detailed demos scheduled for Microsoft Build 2025 in May and documentation already live on the Copilot blog.

That’s a wrap for today! Stay informed and ahead—catch us tomorrow for more insights. Share & subscribe at 🔗 www.technologyinsightsdaily.com. 🚀