

The Coming Utility of Images

Commonly, we think of photographs and images as artistic tools for communicating ideas and emotion, or as representational substitutes for a physical product (e.g., showing the actual color of a blue sweater in a catalog). In the near future, owing to increased connectivity and improved image processing algorithms, I predict images, both photographic and computer generated, will take on an increasingly utilitarian role in our lives.

To help illustrate this concept, I’ve created a fictional scenario which extrapolates existing technologies and current image processing trends.

It’s 2016 and you’ve decided to remodel your kitchen. You make a preliminary run to Home Depot and rent the RoomScanner, a hybrid device combining a LIDAR sensor for measuring distance with an onboard still camera.

At home, you place the device on its lightweight tripod in the center of your kitchen, press start, and the RoomScanner goes to work immediately. First, the laser beam sweeps through the room, making a 3-dimensional grid of thousands of precise measurements throughout your kitchen: everything from the distances between the four walls to the height of the cabinets and the diameter of your cabinet handles.
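At its core, a sweep like this is just geometry: each reading is a pair of angles and a distance, which the scanner converts into a point in 3D space. Here is a minimal sketch of that conversion; the function name and the sample sweep are illustrative, not taken from any real scanner's API.

```python
import math

def lidar_to_cartesian(azimuth_deg, elevation_deg, range_m):
    """Convert one LIDAR reading (angles in degrees, range in meters)
    into an (x, y, z) point relative to the scanner."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)
    y = range_m * math.cos(el) * math.sin(az)
    z = range_m * math.sin(el)
    return (x, y, z)

# A full sweep yields a point cloud: one point per (azimuth, elevation)
# sample. Here, one horizontal ring of readings at 10 degrees elevation,
# each 3.5 meters away.
point_cloud = [lidar_to_cartesian(az, 10.0, 3.5) for az in range(0, 360)]
```

Thousands of such points, taken together, form the measurement grid the rest of the scenario builds on.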

Next, the still camera on the RoomScanner sweeps through the room, taking a mosaic of thousands of still photos to record the color, shape, texture and position of all the items currently in your kitchen. Once completed, the light on the center of the RoomScanner turns green, indicating that all data is now ready to be downloaded.

Plugging the RoomScanner into your computer via USB automatically uploads the complete suite of RoomScanner data to a Web site, where the measured information is used to create an accurate 3-dimensional wireframe grid of your entire kitchen. The still photos provide the textures for the model, and image recognition attempts to determine the make, model and age of your appliances.

Once this 3-dimensional model is complete, you can begin selecting items from the Home Depot Web site and virtually remodel your kitchen. The GPS data and the window measurements let you see what the new kitchen will look like at noon, when the light from the skylights is strongest, or at 6 p.m. in the middle of winter, which can help you predict the need for more lighting over the food preparation areas. You can switch the lights on in the virtual kitchen to see how the light fixtures and bulb type change the mood and utility of the room.
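The daylight simulation described above relies on standard solar-position formulas: given the latitude from the GPS data, a date, and a time, you can compute how high the sun sits in the sky. A rough sketch, using a common declination approximation (the function names are illustrative):

```python
import math

def solar_elevation(latitude_deg, day_of_year, hour):
    """Approximate solar elevation angle in degrees at local solar time.
    Accurate enough to compare noon in June against 6 p.m. in December."""
    # Standard approximation of the sun's declination for this day of year.
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))
    hour_angle = 15.0 * (hour - 12.0)  # degrees away from solar noon
    lat, dec, ha = map(math.radians, (latitude_deg, decl, hour_angle))
    sin_elev = (math.sin(lat) * math.sin(dec)
                + math.cos(lat) * math.cos(dec) * math.cos(ha))
    return math.degrees(math.asin(sin_elev))

# Noon in late June vs. 6 p.m. in late December, at 40 degrees north:
summer_noon = solar_elevation(40.0, 172, 12)     # high overhead sun
winter_evening = solar_elevation(40.0, 355, 18)  # negative: sun has set
```

A renderer would feed angles like these into its lighting model; a negative elevation at 6 p.m. in winter is exactly the cue that the food preparation areas will need artificial light.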

When you’ve completed your selections, click the “Create A Shopping List” button and your entire order is calculated: everything from the amount of paint needed to cover the walls with two coats, to the correct square footage of flooring necessary to complete your dream kitchen.
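The quantity math behind such a shopping list is simple once the model supplies accurate dimensions. A sketch with illustrative figures: one gallon covering about 350 square feet is a common rule of thumb, and the 10% flooring waste factor is my assumption, not a Home Depot formula.

```python
import math

def paint_gallons(wall_area_sqft, coats=2, coverage_sqft_per_gal=350):
    """Gallons of paint for the given wall area, rounded up to whole cans."""
    return math.ceil(wall_area_sqft * coats / coverage_sqft_per_gal)

def flooring_sqft(floor_area_sqft, waste_factor=0.10):
    """Square footage of flooring to order, padded for cuts and waste."""
    return math.ceil(floor_area_sqft * (1 + waste_factor))

# A 12 ft x 14 ft kitchen with 8 ft ceilings, ignoring doors and windows:
walls = 2 * (12 + 14) * 8          # 416 sq ft of wall surface
print(paint_gallons(walls))        # 3 gallons for two coats
print(flooring_sqft(12 * 14))      # 185 sq ft of flooring
```

The point of the scanned model is that `walls` and the floor area come from real measurements rather than a homeowner's tape measure and guesswork.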

This process would take much of the guesswork out of home remodeling jobs like this one, or would allow you to work interactively with a designer and see exactly what the room will look like, from different sight lines, perspectives and times of day, without investing a lot of time or expense.

This process could even be paired with an augmented reality application for your smartphone. Launch the app, enable the camera, walk around the house, and see the virtual world of your remodeled kitchen overlaid on the actual world as seen through your eyes.

It may sound fanciful, but I predict these types of applications will become commonplace in the years to come and will help us perform all kinds of tasks and jobs, from locating the correct gate in a foreign airport and selecting clothes to civic planning and educating our children. The utility of images will provide a link between the information currently found in text form (books, magazines, technical drawings and plans) and the real world, in a natural and interactive way.

Here are a few of the technologies I drew upon to create this hypothetical scenario. All are commercially available products today. If you know of others I should add to the list, please let me know.

Strata 3D: Software used for translating a series of still images into a 3D model.

Photosynth: Image recognition and placement software allowing the creation of interactive “spaces” made of dozens, hundreds or thousands of still photos.

Layar: Augmented reality browser application which uses your smartphone’s camera to overlay online information on a real-time view of the physical world.

GRAIL Lab: Generating 3D models of major European cities from crowdsourced photo libraries.

Related videos to watch

The Map as Information Ecology

Layar Augmented Reality Browser

GRAIL Lab: Building Rome in a Day

Radiohead Video using LIDAR measurement technology

