Tuesday, December 8, 2009
 Google Search Adds Twitter-Facebook-MySpace Feeds, Launches "Goggles"
Google has added real-time results to its Internet search engine, channeling feeds from Facebook, MySpace and Twitter. The company also announced "Goggles", an application that lets people search the Internet using mobile telephone cameras or spoken words in multiple languages.

On December 7, Google held a launch event at the Computer History Museum in Mountain View, CA. The company introduced new features that bring users' search results to life with a dynamic stream of real-time content from across the web. Now, immediately after conducting a search, users can see live updates from people on popular sites like Twitter and FriendFeed, as well as headlines from news and blog posts published just seconds before. When these latest results are relevant, Google said it will rank them so the freshest information appears right on the search results page.
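Google has not published how it blends these live items into the results page. Purely as an illustration, the sketch below (in Python, with an invented half-life and weighting) shows one simple way a ranker could combine topical relevance with an exponential freshness decay so that very recent tweets or headlines can surface.

```python
import time

# Illustrative sketch only; Google's real-time ranking formula is not public.
# Assumption: freshness decays exponentially with a 5-minute half-life and is
# mixed into the final score with a fixed weight.

HALF_LIFE_SECONDS = 300.0

def freshness(published_at, now=None):
    """Freshness score in (0, 1], halving every HALF_LIFE_SECONDS."""
    now = now if now is not None else time.time()
    age = max(0.0, now - published_at)
    return 0.5 ** (age / HALF_LIFE_SECONDS)

def blended_score(relevance, published_at, freshness_weight=0.3):
    """Combine topical relevance with recency; the weight is a made-up tuning knob."""
    return (1.0 - freshness_weight) * relevance + freshness_weight * freshness(published_at)

# A tweet posted 30 seconds ago can outrank a slightly more relevant hour-old page.
now = time.time()
print(blended_score(0.70, now - 30))     # fresh update
print(blended_score(0.75, now - 3600))   # hour-old result
```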

The real-time search enables users to discover breaking news the moment it's happening, even if it's not the popular news of the day. For example, in the screen shot below, the big story was about GM's stabilizing car sales, which appears under "News results." Nonetheless, thanks to Google's real-time algorithms, the "Latest results" feature surfaces another important story that broke just seconds earlier: GM's CEO stepping down.



Click on "Latest results" or select "Latest" from the search options menu to view a full page of live tweets, blogs, news and other web content scrolling right on Google. Users can also filter their results to see only "Updates" from micro-blogs like Twitter, FriendFeed, Jaiku and others. Latest results and the new search options are also designed for iPhone and Android devices.

As part of the real-time search launch, Google has also added "hot topics" to Google Trends to show the most common topics people are publishing to the web in real time.
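Google did not detail how hot topics are computed. A common and simple approach, sketched below purely as an assumption, is to count topic mentions inside a sliding time window over the incoming stream and report the heaviest hitters.

```python
from collections import Counter, deque
import time

# Hedged illustration, not Google's implementation: track topic mentions in a
# sliding window (assumed 10 minutes) and surface the most frequent ones.

WINDOW_SECONDS = 600

events = deque()    # (timestamp, topic), oldest first
counts = Counter()  # topic -> mentions inside the window

def observe(topic, timestamp=None):
    """Record one mention and expire mentions that fell out of the window."""
    timestamp = timestamp if timestamp is not None else time.time()
    events.append((timestamp, topic))
    counts[topic] += 1
    cutoff = timestamp - WINDOW_SECONDS
    while events and events[0][0] < cutoff:
        _, old_topic = events.popleft()
        counts[old_topic] -= 1
        if counts[old_topic] <= 0:
            del counts[old_topic]

def hot_topics(n=10):
    """The n most-mentioned topics in the current window."""
    return counts.most_common(n)
```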

Google has already announced partnerships with Facebook, MySpace, FriendFeed, Jaiku, Identi.ca and Twitter.

The new features will be rolling out in the next few days and will be available globally in English.

Use pictures to search the web

Google has also made some new strides with mobile search. Today's sensor-rich smartphones are redefining what "query" means. Beyond text, users can now search by a number of new modes including voice, location and sight, all from a mobile device. So Google has been working to improve technology that takes advantage of these capabilities.

Starting today, Google is extending its voice search capabilities on Android devices to recognize Japanese. In addition, Google is using the location of users' mobile phones to launch some helpful features, like showing them "what's nearby."
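Google did not explain the mechanics of "what's nearby". The sketch below assumes the obvious building block: take the coordinates reported by the phone and filter a list of candidate places by great-circle distance. The place-list format and radius are invented; only the haversine formula itself is standard.

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

def whats_nearby(user_lat, user_lon, places, radius_km=1.0):
    """Places within radius_km of the user, closest first.

    `places` is a list of dicts with "name", "lat" and "lon" keys (assumed shape).
    """
    scored = [(haversine_km(user_lat, user_lon, p["lat"], p["lon"]), p) for p in places]
    scored.sort(key=lambda pair: pair[0])
    return [p for distance, p in scored if distance <= radius_km]
```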

Finally, Google demonstrated Google Goggles, a visual search application that lets users search for objects by snapping a picture with their camera phone instead of typing words (works with Android 1.6+ devices).

"When you connect your phone's camera to datacenters in the cloud, it becomes an eye to see and search with. It sees the world like you do, but it simultaneously taps the world's info in ways that you can't. And this makes it a perfect answering machine for your visual questions," said Vic Gundotra, Vice President of Engineering, Google.

"In a nutshell, Goggles lets users search for objects using images rather than words. Simply take a picture with your phone's camera, and if we recognize the item, Goggles returns relevant search results," Vic added.

Right now Goggles identifies landmarks, works of art, and products (among other things), and in all cases its ability to "see further" is rooted in powerful computing, pervasive connectivity, and the cloud:

Google first sends the user's image to its datacenters. There, computer vision algorithms create signatures of the objects in the image. Google then compares those signatures against all other known items in its image recognition databases and determines how many matches exist. One or more search results are returned, based on available metadata and ranking signals. Google said all of this happens within a few seconds.
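Google has not released the code behind this pipeline. The sketch below is a small stand-in that illustrates the same idea with off-the-shelf OpenCV tools: local feature descriptors act as the "signatures", and the query photo is matched against descriptors precomputed for a database of known images.

```python
import cv2  # OpenCV, assumed installed; not what Goggles actually uses

orb = cv2.ORB_create()                                     # local feature detector/descriptor
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)  # brute-force descriptor matching

def signature(image_path):
    """Compute keypoint descriptors ("signatures") for one image."""
    image = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    _, descriptors = orb.detectAndCompute(image, None)
    return descriptors

def best_match(query_path, database):
    """Return the database label whose descriptors best match the query photo.

    `database` maps a label (e.g. a landmark name) to precomputed descriptors.
    """
    query = signature(query_path)
    if query is None:
        return None                                        # no recognizable features
    scores = {}
    for label, descriptors in database.items():
        matches = matcher.match(query, descriptors)
        scores[label] = len(matches)                       # crude match count as score
    return max(scores, key=scores.get) if scores else None
```

A production system would of course rely on far larger databases, approximate nearest-neighbour indexes and the additional metadata and ranking signals the article mentions; the brute-force match count here only conveys the shape of the idea.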



"Today Goggles recognizes certain images in certain categories, but our goal is to return high quality results for any image. Today you frame and snap a photo to get results, but one day visual search will be as natural as pointing a finger -- like a mouse for the real world," Vic said.
 