Mon 29 Nov, 2004 10:44 am
Detection of 'counterfeit reality' becoming a new specialty
Monday, November 22, 2004 - 08:00 AM
By Chris Cobbs
Knight-Ridder Newspapers
During the past decade, the DNA technology used to solve crimes and settle paternity suits has become a big business. The federal government alone spent $232 million this past fiscal year promoting the use of a technology that barely existed 20 years ago.
Now two information-technology experts with Florida ties are predicting that the use of digital forensics to police what they call ``counterfeit reality'' will soon join DNA science as a growth industry.
A coming explosion of counterfeit reality -- the use of computers and digitally based media to produce fake images, video, documents or sounds -- will drive a multibillion-dollar business of detecting what is real and what is not, say Daryl Plummer and Frank Kenney, analysts with Gartner Inc., a market-research firm based in Stamford, Conn.
``In a decade, a person will be able to create a movie of you in a place you never were, doing something you never did, and you won't even be able to tell it wasn't you,'' said Plummer, who manages Gartner's emerging trends and technologies division.
Fueling the growth of this counterfeit reality: the proliferation of digital cameras, digital camcorders and the computers and software that allow even nonspecialists to produce convincing fakes, Plummer said during a recent visit to Orlando for a technology symposium.
Plummer and Kenney, a Gartner research analyst, think the fake-detection industry will become a multibillion-dollar-a-year business during the next decade. It may start with government agencies such as the FBI and the U.S. Patent and Trademark Office, they said, but corporations and digital-sleuthing entrepreneurs are expected to enter the field once enough high-tech tools are developed to ferret out imperfections in highly realistic fakes.
Not that the phenomenon is new: doctored photos and moving pictures have been around almost as long as photography and filmmaking themselves.
Hollywood movies have always relied on ``special effects,'' though computers and digital technology have greatly expanded their use, particularly for duplicating actual humans.
When actor Brandon Lee died in 1993 before finishing ``The Crow,'' the film's producers created a digital composite of the actor in a computer to complete his performance. A single actor was used to create legions of digital Agent Smiths to assault Keanu Reeves, who played Neo in ``The Matrix'' trilogy. And several years ago, ``Final Fantasy'' featured an entire cast of photo-realistic human characters created by computer.
Likewise, Madison Avenue has been serving up commercials with counterfeit reality. Recent examples include the late Steve McQueen driving a 2005 model Ford Mustang and the late Fred Astaire dancing not with Ginger Rogers but with a vacuum cleaner.
A different and much grimmer form of picture editing took place in the Soviet Union during the Stalin era, when people who fell from favor were removed from reissued group photos even as they were shipped off to Siberia or worse.
Nowadays, almost anyone with the right equipment can combine pieces of different photos into a single, realistic image or create artificial movies purporting to show real individuals in embarrassing or criminal situations.
The Internet, for example, was recently used to distribute a seemingly authentic image of John Kerry and actress Jane Fonda together at a 1970s anti-war rally, as well as a photo of President Bush ostensibly sitting by a young student while reading an upside-down schoolbook.
The Kerry-Fonda photo would have been difficult to spot as a fake using present technology alone, said Kenney, a 35-year-old graduate of the University of Tampa. The key evidence came from individuals who were present at what turned out to be separate rallies; their testimony made it clear Kerry and Fonda had not shared a dais as shown in the photo. Eventually the two photos used to make the composite surfaced, proving it a fake.
Political dirty tricks aside, digital mavens with a dark streak or a profit motive represent a growing threat.
Authentic-looking photos of 19th century personalities such as Abraham Lincoln have been sold on the Web for thousands of dollars, only to be unmasked as modern, digital creations. And Plummer envisions criminals or unscrupulous homeowners bolstering fraudulent insurance claims by using a digital camera and computer to generate photos of a house with apparent -- but nonexistent -- hurricane damage.
``It's inevitable that counterfeit reality will enter the world's collective consciousness over the next decade,'' said the 44-year-old Plummer, a Florida State University grad.
``It may not be as common as Internet spam or identity theft, but it will be very impactful on socio-economic events, requiring new methods of detection and regulation.''
The FBI and local police agencies are now using digital forensics to analyze documents, photos and sounds, but new techniques will have to be developed to spot phony works, said Plummer, a former division director and technology coordinator for the Florida Department of Management Services.
``As counterfeit reality gets more sophisticated, we will need sophisticated new ways to detect it,'' he said.
Experts now can analyze a suspect image by examining the lighting, reflections and shadows, searching for telltale signs the photo has been doctored, said Paul Henry, senior vice president of CyberGuard, a Fort Lauderdale, Fla., security firm.
Jagged or irregular lines in objects can also be a tip-off that portions of a picture have been faked.
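The kind of inconsistency hunting Henry describes can be illustrated in a few lines of code. The sketch below is a toy example, not CyberGuard's actual tooling: it assumes NumPy and a synthetic grayscale image, and flags a pasted-in region by the simplest possible tell, a mismatched noise level between blocks of the picture.

```python
import numpy as np

def block_noise(img, block=16):
    """Rough per-block noise estimate from horizontal pixel differences.

    A patch pasted in from another photo usually carries a different
    sensor-noise level, so its blocks stand out from their neighbors.
    Differencing adjacent pixels suppresses smooth image content and
    leaves mostly noise behind."""
    diff = img[:, 1:] - img[:, :-1]
    h, w = diff.shape
    rows, cols = h // block, w // block
    return np.array([[diff[r*block:(r+1)*block, c*block:(c+1)*block].var()
                      for c in range(cols)] for r in range(rows)])

# Synthetic 64x64 "photo": uniform camera noise, plus one noisier
# pasted-in patch standing in for a spliced region.
rng = np.random.default_rng(0)
img = rng.normal(128.0, 2.0, (64, 64))               # baseline noise, std 2
img[16:32, 16:32] += rng.normal(0.0, 8.0, (16, 16))  # pasted patch, noisier

nm = block_noise(img)
r, c = np.unravel_index(nm.argmax(), nm.shape)
print(int(r), int(c))  # the noisiest block is exactly where the patch sits
```

Real detection tools work the same way in spirit, comparing statistical fingerprints across regions, but against far subtler differences than this exaggerated example.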
Digging deeper, digital detectives can also employ advanced technology such as ``digital skinning,'' fractal imaging, behavioral artificial intelligence, wire-frame modeling and digital compositing to analyze a suspect image, Plummer said.
Such technology is already used by digital artists to inject realism into animated films -- for example, the individual hairs that react realistically when a character moves in ``Shrek 2,'' he said. The same techniques help build realistic-looking crowds and other backgrounds for video games such as ``Madden NFL 2005.''
Digital forgers also pose a challenge to the legal system, which often draws on photographs and videos for crucial evidence during criminal and civil trials, said Richard Ford, research professor at the Center for Information Assurance at Florida Institute of Technology.
Experts who can detect modified images will be needed to help judges and jurors separate authentic images and sounds from the phony ones, he said.
``It behooves us to be thinking about the ways technology can be abused, so we can get there before the bad guys,'' he said. ``It's always easy to underestimate the power of where computers can go.''
As digital-detection businesses emerge, new laws and policies will also be needed to combat counterfeit reality, said an Orlando security expert who seconds Plummer's call for regulatory oversight.
``Phony reality is here to stay, and it will grow dramatically,'' said John Matelski, chief security officer for the city of Orlando.
``Too few of us recognize that the current laws are inadequate. We need a federal, multi-agency commission to establish protocols for detection and prevention of fraudulent images.''
He said the Federal Communications Commission should also require disclaimers to run whenever counterfeit-reality ads air on TV.
-----------------------------------
Photo/video: A specialist can use software to look at different aspects of the image. For example, a house with a fence, grass and a child standing in front can be examined for subtle differences in the lighting from object to object, which may indicate that one or more pieces of the image were lifted from another photo. If the texture of the paint on the house is too uniform, it could indicate the structure was ``built'' by a graphic artist in a computer rather than a contractor on site.
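The texture test above can be sketched crudely in code. This is an illustrative toy, not a forensic product: it assumes NumPy and stands in synthetic patches for the painted house, comparing the local contrast of natural surface grain against a suspiciously flat computer fill.

```python
import numpy as np

def texture_variance(patch):
    """Local contrast of an image patch; real-world surfaces such as
    painted wood almost never photograph as perfectly even values."""
    return float(patch.var())

rng = np.random.default_rng(1)
# Hypothetical patches cropped from a "photo" of a house wall:
real_paint = 120.0 + rng.normal(0.0, 6.0, (50, 50))  # natural grain
fake_paint = np.full((50, 50), 120.0)                # flat graphic-artist fill

print(texture_variance(real_paint))  # roughly 36 (std 6 squared)
print(texture_variance(fake_paint))  # 0.0 -- "too uniform" tell
```

In practice an analyst would also compare lighting direction and shadow geometry between objects, which requires far more than a variance check; the point here is only that "too perfect" is itself measurable.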
Audio: A specialist checking the veracity of someone's recorded remarks might start by filtering out the subject and listening to the background noise instead. Subtle changes in those secondary sounds -- or obvious ones, such as a dog going silent in midbark -- could indicate two recordings were digitally spliced together in an attempt to put words in the subject's mouth.
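The background-noise check described above can likewise be sketched in a few lines. The code below is a simplified illustration, not a real forensic tool: it assumes a mono signal as a plain list of samples, a synthetic 60 Hz hum standing in for room tone, and flags frame boundaries where the background level changes too abruptly to be natural, as it would if a bark were cut off mid-note.

```python
import math

def frame_rms(samples, frame=256):
    """RMS loudness of each fixed-size frame of a mono signal."""
    return [math.sqrt(sum(s * s for s in samples[i:i + frame]) / frame)
            for i in range(0, len(samples) - frame + 1, frame)]

def splice_points(samples, ratio=4.0):
    """Flag frame boundaries where the background level jumps or drops
    by more than `ratio` -- a crude sign of two recordings joined."""
    rms = frame_rms(samples)
    return [i for i in range(1, len(rms))
            if rms[i] > ratio * rms[i - 1] or rms[i - 1] > ratio * rms[i]]

# Synthetic background: steady 60 Hz hum at an 8 kHz sample rate, then
# near-silence where a splice removed the room tone.
hum = [0.2 * math.sin(2 * math.pi * 60 * t / 8000) for t in range(4096)]
spliced = hum[:2048] + [0.0001] * 2048

print(splice_points(spliced))  # → [8]: the hum vanishes at frame 8
```

Real splice detection looks at much finer cues (phase continuity, compression artifacts, electrical-network frequency traces), but the underlying idea is the same: the background should change smoothly, and an edit rarely does.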
Source: Gartner Inc.