
Here Come the Holograms by Geoff Kim

As we stand on the brink of a new era in augmented reality (AR), Meta’s latest AR glasses prototype, Orion, equipped with advanced waveguide technology, marks a significant milestone. These glasses project realistic holograms onto the real world, and Meta envisions a future where they could replace smartphones within the next three to five years. However, with this groundbreaking technology comes a pressing need for thoughtful regulation to ensure safety and prevent potential hazards in our increasingly virtual interactions.

The Rise of Hyper-Realistic Holograms

Meta’s AR glasses represent a leap forward in blending the digital and physical realms. The waveguide technology allows for the seamless overlay of holograms onto our everyday environment, making interactions with virtual objects more intuitive and immersive. While this innovation promises enhanced user experiences and convenience, it also raises significant safety concerns that must be addressed through regulation.

Safety Risks of Overly Realistic Holographic UIs

One of the primary safety issues arises when holograms become indistinguishable from reality. Imagine a virtual table projected onto a living room floor. If a user mistakenly perceives the holographic table as real, they might set a physical object down on it, sending the object to the floor and risking damage or injury. Such scenarios highlight the blurred lines between the virtual and the tangible, underscoring the necessity for regulations that ensure users can easily differentiate between real and virtual elements.

The Imperative for Regulatory Frameworks by 2028

By 2028, as holographic UIs become more integrated into daily life, establishing robust regulatory frameworks will be essential. These regulations should focus on:

  • Visual Distinction: Ensuring holograms have clear visual indicators that differentiate them from real objects. This could include subtle outlines, colour variations, or motion cues that signal their virtual nature.
  • User Awareness: Mandating user education and interface designs that promote awareness of the holographic environment. Interfaces should inform users when they are interacting with virtual elements to prevent accidental mishaps.
  • Safety Standards: Developing and enforcing safety standards for holographic projections to minimise the risk of accidents in public and private spaces. This includes guidelines on the placement, size, and behaviour of virtual objects.

Potential Solutions for Enhanced Holographic UI Safety

To effectively distinguish virtual objects from real ones and enhance user safety, several innovative UI solutions can be implemented:

  • Universal Lighting Indicators: Establishing a standardised lighting method for all virtual objects can serve as a universal cue. For example, virtual objects could emit a consistent glow or shimmer that real objects do not, making it easier for users to recognise their virtual nature instantly.
  • Colour Coding and Hues: Utilising different colour hues to signify various types or statuses of virtual objects can provide intuitive understanding. For instance:
    • Green Hues: Indicate interactive elements or navigational aids.
    • Red Hues: Warn of potential hazards or restricted areas.
    • Blue Hues: Represent informational displays or notifications.
  • Motion Patterns: Incorporating specific motion patterns or animations for virtual objects can further distinguish them from static real-world items. For example, floating virtual objects might pulsate gently or exhibit subtle movements unique to their functions.
  • Symbolic Overlays: Adding symbolic overlays or icons to virtual objects can provide additional context. A virtual table might display a small holographic icon indicating it’s virtual, reducing the likelihood of accidental interaction.
  • Audio Cues: Complementing visual indicators with audio cues can reinforce the distinction between real and virtual objects. Soft sounds that accompany virtual objects when interacted with can alert users to their non-physical nature.
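To make the colour-coding idea concrete, here is a minimal sketch of how a renderer might look up the visual cue for a virtual object. All names and values (the object kinds, hues, and pulse rates) are illustrative assumptions, not part of any real AR SDK:

```python
from enum import Enum

class VirtualObjectKind(Enum):
    INTERACTIVE = "interactive"  # buttons, navigational aids
    HAZARD = "hazard"            # warnings, restricted areas
    INFO = "info"                # informational displays

# Hypothetical mapping of object kinds to the cues described above:
# a signature hue plus a gentle pulse rate (in Hz) as a motion cue.
INDICATOR_SCHEME = {
    VirtualObjectKind.INTERACTIVE: {"hue": "green", "pulse_hz": 0.5},
    VirtualObjectKind.HAZARD: {"hue": "red", "pulse_hz": 2.0},
    VirtualObjectKind.INFO: {"hue": "blue", "pulse_hz": 0.25},
}

def indicator_for(kind: VirtualObjectKind) -> dict:
    """Return the visual cue a renderer would apply to a virtual object."""
    return INDICATOR_SCHEME[kind]
```

The point of centralising the scheme in one table is that a regulator-mandated standard could be swapped in without touching rendering code.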

Balancing Innovation with Safety

While the potential for AR glasses to replace smartphones is immense (with over a 70% likelihood, pending advancements in device miniaturisation and battery life), balancing innovation with safety is paramount. Manufacturers and developers must collaborate with regulatory bodies to implement safety measures without stifling technological progress. This balanced approach will foster trust and ensure the widespread adoption of AR technologies without compromising user safety.

Societal and Ethical Considerations

Beyond physical safety, the regulation of holographic UIs touches on broader societal and ethical issues. Ensuring that augmented reality enhances human experiences without creating new vulnerabilities is crucial. Ethical guidelines should address concerns related to privacy, data security, and the psychological impacts of living in a blended virtual-physical world.

Conclusion

As AR technology rapidly evolves, the integration of realistic holograms into our daily lives presents both incredible opportunities and significant challenges. The development of comprehensive regulations by 2028 will be essential to safeguard users and ensure that the transition to augmented reality is both smooth and secure. By proactively addressing safety concerns and establishing clear standards, we can embrace the future of holographic UIs with confidence and responsibility.

For more insights on the intersection of technology, safety, and design, explore geoff.kim and stay updated with the latest discussions on the Naked Tech Podcast.

Endiatx’s Tiny Robot Pill by Geoff Kim

Exploring inner space?

In the evolving landscape of medical technology, Endiatx’s tiny robot pill represents a leap forward in diagnostics. This innovative device, equipped with cutting-edge cameras and sensors, is designed to be swallowed, offering a non-invasive way to examine the body’s internal workings.

Advanced Sensing Technology

Endiatx’s tiny robot pill is packed with high-resolution cameras and sensors capable of capturing detailed images and data as it travels through the gastrointestinal tract. These sensors can detect conditions ranging from inflammation to early signs of cancer, providing critical information for diagnosis and treatment.

Automation and Precision

A standout feature of this tiny robot is its automation capabilities. Unlike traditional endoscopies, which can be uncomfortable and require manual operation, the Endiatx pill navigates autonomously through the digestive system. This ensures a thorough and precise examination, reducing human error and improving diagnostic accuracy. Real-time data transmission to healthcare professionals allows for immediate analysis and intervention if necessary.

Minimally Invasive and Patient-Friendly

Traditional diagnostic procedures, such as endoscopies and colonoscopies, often involve significant discomfort and preparation. Endiatx’s swallowable robot pill offers a minimally invasive alternative. Patients simply swallow the pill, which then conducts its examination as it naturally progresses through the digestive system. This patient-friendly approach broadens the accessibility of vital diagnostic procedures.

Potential Future Hardware Innovations

Building on the success of Endiatx’s tiny robot pill, several future hardware innovations could emerge:

  • Micro-Surgical Robots: Tiny robots capable of performing minor surgeries internally.
  • Internal Monitoring Devices: Long-term implantable sensors for continuous health monitoring.
  • Smart Drug Delivery Systems: Pills that release medication at specific locations.
  • Biopsy Robots: Miniature robots for less invasive biopsies.
  • Precision Imaging Devices: Enhanced imaging for more accurate diagnoses.

A New Era in Medical Diagnostics

Endiatx’s tiny robot pill signifies a new era in medical diagnostics, combining advanced sensing technology with automation to provide a comprehensive and less invasive method for internal examination. This technology is set to revolutionise healthcare, making diagnostics more efficient, accurate, and accessible.


For more on the intersection of technology and healthcare, stay tuned to geoff.kim, and don’t forget to check out the latest episodes of the Naked Tech Podcast.

The Technosphere: Understanding Our Planet's Largest Life Form by Geoff Kim

In a recent episode of the Lex Fridman Podcast, Lex delves into the fascinating concept of the "technosphere" with Sara Imari Walker, an advocate of this thought-provoking idea. As someone deeply interested in the intersection of technology and life, this conversation resonated with me on multiple levels. Here’s my take on their discussion.

The Technosphere: A New Perspective on Life

Sara Imari Walker introduces the technosphere as the integration of life and technology on Earth, suggesting it is the largest and most alive entity we know. This idea challenges our traditional views of life, which typically focus on individual organisms or ecosystems. By considering the technosphere, we begin to see our technological creations not as separate from nature but as extensions of it.

Time as a Gigantic Object

One of the most compelling points in the discussion is the perception of time. Walker suggests that if we could perceive time fully, we would see the universe as a gigantic, interconnected object. This view implies that every moment and interaction is part of a vast, intricate timeline, fundamentally altering our understanding of existence. It’s a reminder that our actions today are threads in the larger fabric of history.

High-Dimensional Life

Walker posits that life is a high-dimensional phenomenon, where different aspects of an entity can be alive to varying degrees. This perspective complicates the binary understanding of life and non-life, suggesting a spectrum based on historical causation embedded in entities. It’s a nuanced view that invites us to consider how tech and biology intertwine in more sophisticated ways.

The Role of Technology in Evolution

The conversation also touched on how our technological advancements are not just tools but active participants in evolution. The technosphere is seen as a dynamic, evolving system that shapes and is shaped by human activity. This symbiotic relationship suggests that our future is not just biological but deeply intertwined with our technological creations.

Interconnectedness of Life

Finally, Walker emphasises the interconnectedness of all life forms through time and evolution. This interconnectedness means that life cannot be understood merely at the individual level but must be considered across various scales, from microbial to planetary. It’s a holistic view that resonates with the idea of the technosphere, highlighting the intricate web of connections that define our existence.

Embracing the Technosphere

As we continue to advance technologically, it’s crucial to understand and embrace the concept of the technosphere. By recognising our technology as an integral part of life's fabric, we can make more informed decisions about our future and our place in the universe. For more musings on tech and design, stay tuned to geoff.kim. And don’t forget to check out the Naked Tech Podcast for the latest in tech news and geek culture.

App Intents: Ushering a New Era of Semantic Understanding in Apps by Geoff Kim

In the ever-evolving landscape of technology, Apple's introduction of the App Intents framework marks a pivotal moment for app developers and users alike. This innovation isn't just a feature; it's the dawn of a new field rooted in the semantic understanding of user actions. Coupled with Apple's recent announcement of "Apple Intelligence," this framework is set to revolutionise how we interact with our devices.

A New Field: Semantic Understanding

The App Intents framework is designed to interpret and act upon user commands with a depth of understanding that feels almost human. It's about moving beyond simple voice commands to grasp the intent behind those commands. Imagine telling Siri, "Order my usual coffee," and it not only knows which coffee shop you prefer but also which specific drink you want, thanks to the semantic context it has learned over time.

This shift means that app developers now need to delve into the intricacies of user language. It's no longer sufficient to just recognise keywords; apps must understand the nuances of user requests. This semantic layer transforms how we interact with our devices, making the experience more intuitive and personalised.

The Power of Apple Intelligence

At its annual Worldwide Developers Conference (WWDC), Apple unveiled Apple Intelligence, its branded 'personal intelligence system' that will be deeply integrated into its platforms. Apple Intelligence is built on a family of generative models created by Apple, including on-device and server foundation models. These models are designed to deliver useful and relevant intelligence right where you need it.

The on-device model, with approximately 3 billion parameters, is optimised for speed and efficiency, achieving a time-to-first-token latency of 0.6 milliseconds per prompt token and a generation rate of 30 tokens per second on the iPhone 15 Pro. This on-device foundation model will be leveraged to deliver quick actions on an iPhone, ensuring that the device can handle everyday activities like summarisation, mail replies, and proofreading with impressive speed and accuracy.
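To put those figures in perspective, here is a rough back-of-the-envelope estimate of end-to-end response time, assuming (simplistically) that the quoted per-token latency and generation rate scale linearly:

```python
def estimated_response_time(prompt_tokens: int, output_tokens: int) -> float:
    """Rough response-time estimate in seconds, using the quoted figures:
    0.6 ms of time-to-first-token latency per prompt token and a
    generation rate of 30 tokens per second on-device."""
    time_to_first_token = prompt_tokens * 0.6e-3  # seconds
    generation_time = output_tokens / 30.0        # seconds
    return time_to_first_token + generation_time

# A 500-token mail thread summarised into 60 tokens:
# 500 * 0.0006 + 60 / 30 = 0.3 + 2.0 = 2.3 seconds
summary_latency = estimated_response_time(500, 60)
```

Even with these crude assumptions, the numbers suggest why short, everyday tasks are a natural fit for the on-device model.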

The Role of App Developers

For developers, this framework brings exciting opportunities and challenges. They must now design apps that can seamlessly interpret and execute a wide range of user commands. This involves integrating natural language processing (NLP) capabilities and building robust models that can learn from user interactions.

Consider a fitness app that, when asked, "How did I do last week?" can pull up a detailed summary of your workouts, highlight your progress, and even suggest improvements. Or a travel app that understands, "Book me a flight to my next meeting," and can automatically find flights based on your calendar events and preferences.
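App Intents itself is a Swift framework, but the underlying idea those examples rely on, resolving a free-form utterance into a structured intent with typed parameters, can be sketched language-agnostically. All names and patterns below are hypothetical, not Apple API:

```python
import re
from dataclasses import dataclass

@dataclass
class Intent:
    name: str
    parameters: dict

# Hypothetical patterns a fitness or travel app might register.
PATTERNS = [
    (re.compile(r"how did i do last (?P<period>week|month)", re.I),
     "WorkoutSummary"),
    (re.compile(r"book me a flight to (?P<destination>.+)", re.I),
     "BookFlight"),
]

def resolve(utterance: str):
    """Map a free-form utterance onto a structured intent, if any pattern matches."""
    for pattern, name in PATTERNS:
        match = pattern.search(utterance)
        if match:
            return Intent(name, match.groupdict())
    return None
```

Real semantic understanding replaces the regular expressions with a language model, but the contract is the same: unstructured language in, a named intent with typed parameters out, which the app then executes.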

User Language and Voice Commands

At the core of this transformation is the need for developers to understand user language deeply. It's about creating an app that can converse naturally with the user. This requires a blend of technology and psychology: understanding how people express their needs and designing responses that feel natural and helpful.

Siri and voice commands at the operating system level are becoming more sophisticated, allowing for a more fluid interaction between the user and the app. This means developers must ensure their apps are not only responsive but also capable of engaging in meaningful dialogue.

The Future of Interaction

The future of app interaction is bright with the App Intents framework and Apple Intelligence. We are moving towards an era where our devices understand us better than ever before. This framework is paving the way for apps that are smarter, more responsive, and deeply attuned to our needs.

For me, this is reminiscent of the first time I used a touch screen—it just felt right. The App Intents framework, powered by Apple Intelligence, promises to deliver that same sense of seamless interaction, making our digital experiences more natural and integrated into our daily lives.

To all the developers out there, it's time to dive into the world of semantic understanding and leverage the power of Apple Intelligence. Embrace the challenge, and let's build the future of apps together.


For more musings on tech and design, stay tuned to geoff.kim. And if you're into the latest in tech news and geek culture, don't forget to check out the Naked Tech Podcast where Kelvin and I break down all the keynotes and announcements by the major technology companies.

Futures made of virtual augmentality by Geoff Kim

DamiLee has created a fantastic video essay exploring how VR/AR technologies may transform our spaces and environments. I've always envisioned these possibilities, but this is the best articulation I've seen of how an augmented future could take shape.

Some key points from the video:

  • Headsets on public transit will be unsettling because they hide emotions and intentions. Apple's EyeSight feature, which transmits users' eye expressions to others, could help with this.

  • VR and AR allow multiple subjective spaces within one physical space. People sitting together could have totally different realities.

  • Stores may use AR to create hyper-local experiences that can only exist in that location.

  • Homes may change dramatically without TVs as the focal point. More modular and movable furniture could arise.

  • "Filling in the gaps" with AR - overlaying virtual elements between real objects - could increase believability and enrich experiences.

  • AR could creatively transform overlooked urban areas like alleys and parking lots at low cost.

  • Thoughtful VR/AR implementation could help democratise shaping environments and empower more people to influence the visual fabric of their communities.

  • The ‘wealthy’ can buy spaces with fewer virtual ads.