Categories
Mobile Syrup

Google races to modernize Search Engine with AI features to meet competition

In March this year, Google realized that its search engine was facing the biggest threat it had seen in 25 years.

It learned that Samsung was considering replacing the default Google Search engine with Bing on its devices, as reported by The New York Times.

According to internal messages reviewed by the publication, Google’s first reaction to the news was “panic,” and rightfully so. If Samsung switched to Bing as its default engine, roughly $3 billion (about $4 billion CAD) of Google’s annual revenue would be at stake. “An additional $20 billion is tied to a similar Apple contract that will be up for renewal this year,” wrote the publication.

To counter the competition, Google is currently racing to build an all-new search engine powered by AI technology.

Under the project, codenamed Magi, designers, engineers and executives at Google are reportedly working in so-called sprint rooms to tweak and test the latest version of the AI search engine. According to the publication, the new search engine would offer users a far more personalized experience than the current one.

Lara Levin, a Google spokeswoman, said in a statement that “not every brainstorm deck or product idea leads to a launch, but as we’ve said before, we’re excited about bringing new A.I.-powered features to search, and will share more details soon.”

The new AI tools for the Google Search engine are anticipated to be made available to the public next month, with further updates and features scheduled for release in the fall. The report also suggests that the company will release the tool to roughly one million people initially and expand to 30 million by the end of 2023. In the beginning, the feature will remain exclusive to U.S. users.

Read the full New York Times report here.

Source: The New York Times

Google will show extreme heat alerts in Search soon

Over the coming months, Google will roll out extreme heat alerts in Search that aim to help people stay safe during heat waves, alongside AI-powered tools to help communities handle hotter temperatures.

According to Google, people often turn to the internet with questions when trying to stay safe during extreme weather. To surface authoritative and helpful information when people search for extreme heat, Google will now show alerts that detail when a heat wave is predicted to start and end, tips on staying cool and other health-related concerns to be aware of.

“To make sure the information is relevant and accurate, we’re working with the Global Heat Health Information Network (GHHIN),” wrote Google.

Further, according to Google, cities around the world are looking for ways to prevent “heat islands,” which are urban areas with higher temperatures due to structures like roads and buildings that absorb heat and re-emit it.

Tree Canopy is part of Google’s Environmental Insights Explorer, and it combines AI and aerial imagery to help cities understand their current tree coverage and better plan future urban forestry initiatives. “The City of Austin has already used the tool to prioritize planting trees in vulnerable areas of the city and even used it to help place bus shelters to increase shade,” wrote Google.

Now Google has announced that it is expanding Tree Canopy from 14 cities to nearly 350 cities globally, including Toronto, with plans to expand to more cities this year.

Beyond that, Google is also leveraging its AI algorithms and aerial imagery to help urban planners and governments identify areas that would benefit the most from ‘cool roofs,’ which are designed to reflect sunlight and absorb less heat.

Check out Google’s full blog post to learn more about its initiative to help people stay safe during heat waves.

Image credit: Google

Source: Google

Google unveils its AI chatbot Bard and feature updates at Paris event

Following in Microsoft’s footsteps, Mountain View, California-based Google announced AI-focused updates to Search, Maps and Translate at its Live from Paris event today, as reported by The Verge.

Just two days ago, on Monday, February 6th, Google announced that its ChatGPT competitor ‘Bard’ will be available to the public in the “coming weeks,” with Google CEO Sundar Pichai describing it as an “experimental conversational AI service” powered by LaMDA. The chatbot made an appearance at the event, where Prabhakar Raghavan, senior vice president at Google, said that users would be able to interact with Bard to explore complex topics, collaborate in real-time and get new and creative ideas.

Google then explained how some questions have ‘no one right answer,’ or NORA. This applies to questions like “what is the best constellation to look at when stargazing?”, where the answer is subjective. To help answer such queries, Google is introducing generative AI directly into Search results.

Soon, if you ask Google Search questions that have NORA, the new generative AI features would organize complex information and multiple viewpoints and opinions, and combine them in your search results.

Here are some of the other new announcements made across Google’s platforms, with some features releasing in the near future, and others in the coming weeks and months:

The Street View and Live View mixture, called Immersive View, is now beginning to roll out in five cities, namely London, Los Angeles, New York City, San Francisco, and Tokyo. The feature will next expand to Florence, Venice, Amsterdam, Dublin, and more.

The multi-search tool that allows users to initiate a search using an image and a few words of text is also receiving an update. The feature allows users to take photos of objects like food, supplies, clothes and more, and add the phrase “near me” in the Google app to get search results showcasing local businesses, restaurants or retailers that carry that specific item. The feature was limited to the United States, but is now rolling out globally wherever Google Lens is available. The feature will also be available on mobile web globally in the next few months.

Under Google Maps, the company is adding new features to assist EV drivers, including suggested charge stops for shorter trips, filters for “very fast” charging stations and indications of places with chargers in search results for places like hotels and grocery stores.

Further, ‘Translate with Lens’ for images is now rolling out globally. Normally, if you’d translate the text on an image, the translated text would be added on top of the image as ‘extra text,’ and wouldn’t be blended in. This would block or distort the image behind the text. Now, with a new machine learning tool called ‘Generative Adversarial Networks’ (the same tool used in Pixel phones for Magic Eraser), Google Lens can translate the text, and blend it back into the background image without distorting it, making the image retain its natural look.

ETAs and turn-by-turn navigation via Google Maps would now be visible on your lock screen. The feature is also compatible with iOS 16’s Live Activities. Google Translate will also be introducing additional context and information for certain words or phrases, starting with English, French, German, Japanese, and Spanish languages in the coming weeks. The new Google Translate design for Android will be available on iOS in the near future.

Follow the links to learn more about the new Search, Maps and Translate features.

Image credit: Google 

Source: Google, Via: The Verge

The Last of Us takes over Google Search with creepy easter egg

Google now features a little fungal easter egg to celebrate HBO’s new The Last of Us series.

When you search “The Last of Us” on Google, a little mushroom will pop up. Clicking the fungus will then cause an outgrowth to appear on your screen. Amusingly, you can keep clicking the icon and the infection will grow and spread to cover more and more of the Google page.

Of course, the easter egg is referencing the Cordyceps infection that decimates humanity in The Last of Us. Notably, Cordyceps is a real life fungus that affects insects and arthropods, but The Last of Us takes creative liberty and extends that to humans.

Based on PlayStation’s 2013 game of the same name, HBO’s The Last of Us just premiered its second of nine episodes on January 22nd. In the series, a hardened smuggler is hired to escort a teenager across a pandemic-ravaged U.S.

The Last of Us was created by Neil Druckmann (the original PlayStation game’s writer/co-director) and Craig Mazin (Chernobyl) and stars Pedro Pascal (The Mandalorian) and Bella Ramsey (Game of Thrones).

The series is currently streaming on Crave. For more on The Last of Us, check out our interview with Druckmann, Mazin, Pascal and Ramsey.

Image credit: HBO

Google to introduce ‘related topics’ to Search in the ‘coming days’

Google is making it easier for users to find the exact search results that they’re looking for.

Currently, when users search for something on Google Search, they can filter the results to show just ‘Videos,’ ‘Images,’ ‘News,’ ‘Books,’ ‘Shopping,’ and more. Starting now, users will also see an easy-to-scroll list of related topics alongside the above-mentioned filters at the top of the Search results page, as shared by Google in its latest blog post.

As an example, if you’re searching for “Dinner Ideas,” expect to see related topics like ‘healthy,’ ‘easy,’ ‘quick,’ and more at the top of the results. Tapping on any of the related topics adds to your query, allowing you to refine your search without having to type out your complete query.

“Topics are dynamic and will change as you tap, giving you more options and helping you explore new areas. For example, if you tapped on ‘healthy,’ you might see ‘vegetarian’ or ‘quick’ appear next,” says Google. 
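
Google hasn’t published how the feature works under the hood, but the refinement mechanic it describes (each tapped topic is appended to the existing query rather than replacing it) can be illustrated with a minimal, purely hypothetical sketch:

```python
def refine_query(query, tapped_topics):
    """Append tapped topic chips to the base query, skipping duplicates.

    Loosely mimics how tapping 'healthy' and then 'quick' on a
    'dinner ideas' search refines it without retyping the query.
    This is an illustrative sketch, not Google's implementation.
    """
    terms = query.split()
    for topic in tapped_topics:
        if topic not in terms:
            terms.append(topic)
    return " ".join(terms)
```

For example, `refine_query("dinner ideas", ["healthy", "quick"])` yields the refined query `"dinner ideas healthy quick"`.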

Google selects the relevant related topics based on its understanding of how people search and on its analysis of content across the web.

“Both topics and filters are shown in the order that our systems automatically determine is most helpful for your specific query,” says Google. “If you don’t see a particular filter you want, you can find more using the “All filters” option, which is available at the end of the row.”

The feature is rolling out over the coming days for English users in the U.S. for the Google app on iOS and Android, as well as for mobile web users.

Learn more about related topics here.

Image credit: Google

Source: Google

Apple’s Google search competitor is ‘at least four years away’

Rumours that Apple is working on its own Google search-like platform have appeared again, this time courtesy of The Information.

The report mentions that as part of this effort, Apple acquired AI news startup Laserlike back in 2018, a company founded by several former Google engineers. However, The Information says that these engineers have returned to Google, including Srinivasan Venkatachary, who reportedly took on the role of Apple’s search team lead. This division of the tech giant works on features like Spotlight and Siri.

“Venkatachary is now vice president of engineering at Google under James Manyika, senior vice president of technology and society, whose portfolio includes a group tasked with tracking how technologies such as artificial intelligence are affecting social issues, according to a person with knowledge of the situation,” writes The Information.

This development is especially interesting because Google reportedly pays Apple roughly $18 billion to $20 billion per year (via 9to5mac) to be the default search engine on the former company’s devices. It’s likely that if Apple’s search engine ambitions come to fruition, this agreement would be quickly severed.

Finally, The Information mentions that Apple is still roughly four years away from launching a Google search rival.

Hopefully, if Apple’s search engine eventually launches, it isn’t as bad as Apple Maps was when it was first released.

Source: The Information Via: 9to5Mac

New Google Search features help you ask the right questions

At its annual Search On event, Google showcased its efforts to make Search more natural and intuitive for users, under the themes of ‘search outside the box’ and ‘organize the world’s information and make it universally accessible and useful.’

The tech giant outlined upcoming developments for Google Search, Google Maps and Google Shopping.

Google Search

Multi-Search Near Me: Announced at Google I/O 2022, the multi-search near me feature allows users to take photos of objects like food, supplies, clothes and more, and add the phrase “near me” in the Google app to get search results showcasing local businesses, restaurants or retailers that carry that specific item. Google runs the image through its database and cross-references it with several photos, reviews and web pages to deliver accurate nearby results. The feature is rolling out in English in the U.S. this fall.

Translate with Lens: According to Google, translations with Google Lens are used roughly one billion times per month in more than 100 languages. Originally, the way the feature worked was that it would translate the text on an image and add it on top of the photo as extra text. This would block or distort the image behind the text. Now, with a new machine learning tool called ‘Generative Adversarial Networks’ (the same tool used in Pixel phones for Magic Eraser), Google Lens can translate the text, and blend it back into the background image without distorting it, making the image retain its natural look. See the screenshot below for reference. ‘Lens AR translate’ is rolling out globally later this year.

Shortcuts: Launching today, September 28th, you’ll see some of Google’s most helpful tools and shortcuts, including translate, ‘shop your screenshots,’ ‘hum to search,’ and more, displayed right on top of the Google app (under the Search bar). The shortcut feature is rolling out for the iOS application in English in the United States.

Easier to ask questions: Google is making it easier for users to ask the questions they intend to ask. When you start typing a query, Google will now suggest keywords or topics that might help you better craft your questions. “Say you’re looking for a destination in Mexico. We’ll help you specify your question — for example, ‘best cities in Mexico for families’ — so you can navigate to more relevant results for you,” says Google. The highlighted keywords, and the subsequent results will be relevant to your query, and will include content from creators available online. “For topics like cities, you may see visual stories and short videos from people who have visited, tips on how to explore the city, things to do, how to get there and other important aspects you might want to know about as you plan your travels,” says Google.

Additionally, the search results you’ll see will come in various formats. Depending on the relevancy of your query, the results can be shown as text, images, videos or a combination of those formats. This new format of asking questions on Google Search will roll out in the U.S. in the coming months.

Google Maps

Google intends to make Maps more visual and immersive, and it is starting off with new features that aim to make you feel like you’re physically in the location you’re searching for.

Neighbourhood Vibe: The Neighbourhood Vibe feature lets you quickly check popular locations in a given area, like famous restaurants, cafes, museums and other landmarks that people search for. Results appear alongside helpful photos and information right on the map. The feature is launching globally on Android and iOS in the coming months.

Landmark Aerial View and Immersive View: The upcoming feature allows users to see photorealistic aerial views of up to 250 key global landmarks like the CN Tower, Eiffel Tower, Tokyo Tower and more. The aerial view also lets you soak in a landmark’s surroundings, making it easier to decide whether you want to visit.

Immersive View, on the other hand, works in tandem with Aerial View, and allows users to plan their trip ahead of time by checking what the weather, traffic and crowd will be like at a given place on a given day and time.

For example, you may search for the CN Tower in Toronto and see that finding parking around the landmark is a pain, so you decide to skip the trip or take public transit instead.

Landmark Aerial View is rolling out globally on Maps today, September 28th, whereas Immersive View is launching in Los Angeles, London, New York, San Francisco and Tokyo in the coming months.

Live View: First released three years ago, Google’s Live View is an augmented reality tool that overlays walking directions on your live camera feed. All you need to do is lift and point your phone to find directions and essential places like shops, ATMs and restaurants. The feature has long been available in Canada, and will start rolling out in London, New York, Paris, San Francisco and Tokyo in the coming months.

Eco-Friendly Routing expanding to developers: Google launched its Eco-Friendly Routing feature for Google Maps in Canada earlier this year. The feature allows users to analyze the estimated carbon emissions of their planned route and suggests alternative directions that would consume less fuel, while taking traffic, road steepness and other variables into account.

Developers will now have the option to add the feature to their company apps, allowing drivers for companies like SkipTheDishes, Uber Eats and Amazon to select their engine type and find the best routes and directions for fuel and energy efficiency. The feature will be available in Preview later this year for developers in the U.S., Canada and select countries in Europe where Eco-Friendly Routing is available.
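
Google hasn’t published its exact routing model, but conceptually, eco-routing is a shortest-path problem in which each road segment’s weight encodes estimated energy use (folding in distance, traffic and road steepness) rather than just travel time. A minimal sketch, assuming a toy road graph with made-up energy costs:

```python
import heapq

def eco_route(graph, start, goal):
    """Find the lowest-energy route with Dijkstra's algorithm.

    `graph` maps a node to a list of (neighbour, energy_cost) pairs,
    where energy_cost stands in for estimated fuel or battery use on
    that segment. Illustrative only; not Google's actual model.
    """
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry; a cheaper path was found
        for neighbour, cost in graph.get(node, []):
            nd = d + cost
            if nd < dist.get(neighbour, float("inf")):
                dist[neighbour] = nd
                prev[neighbour] = node
                heapq.heappush(pq, (nd, neighbour))
    # Walk predecessors back from the goal to rebuild the route.
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return list(reversed(path)), dist[goal]
```

On a toy graph where A→C directly costs 5.0 units of energy but A→B→C costs 2.0 + 1.0, the function picks the detour: `eco_route({"A": [("B", 2.0), ("C", 5.0)], "B": [("C", 1.0)], "C": []}, "A", "C")` returns `(["A", "B", "C"], 3.0)`.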

Google Shopping

Google is introducing new tools for Shopping that aim to help users discover things they’ll like by providing visual cues, insights about products and services, and filters to trim down options.

3D shopping for shoes: Some websites allow users to take a 3D look at shoes from different angles. According to Google, people engage with 3D images almost 50 percent more than static ones.

Websites need to have this feature built in, and some smaller businesses don’t have the resources to do that. This is where Google comes in. The company says it can now automate 360-degree spins of shoes using just a handful of photos, essentially the ones already available on the merchant’s website, letting users check out shoes in 3D regardless of whether the merchant’s page offers the feature itself. This feature will be available in Google Shopping in the U.S. in early 2023.

Shopping Insights: The new feature will provide noteworthy insights about the product category you’re browsing for.  Google says insights will be gathered from a “wide range of trusted sources,” and compiled in one place. For example, if you’re searching for mountain bikes, Google will provide insights about the size of the bike you should be riding, information about brakes and suspension, popular trails around you and more.

The feature is rolling out for Google Shopping today, September 28th.

Dynamic shopping filters: Normally, filters allow you to sort results by price and relevancy. With the new shopping filters on Google, users will be able to shop categorically. For example, if you’re looking for jeans, you’ll be able to sort them by ‘skinny,’ ‘wide leg,’ ‘boot cut’ and more. Similarly, for T-shirts, filters such as ‘round neck,’ ‘crew neck,’ ‘baggy’ and more will let you tailor the Google Shopping experience to your needs.

Dynamic Filters for Google Shopping will be “available soon” in select countries in Europe.

Google rolls out safety feature allowing users to request personal information removal

Some Google users can now remove search results that detail personal identifiable information, following Google’s announcement of the “Results About You” feature at its 2022 I/O event in May.

According to Google’s support page, information that can be removed includes bank account numbers, images of ID documents or handwritten signatures, and phone numbers.

Users can access the safety feature by selecting their profile picture on the Google app. A menu page titled “Results About You” will open up, detailing how to send a removal request from Google.

If you find a page that contains personal information, you can select the menu icon (three vertical dots in the top-right corner) next to the result, and select “remove result.”

Once submitted, users can monitor the progress of their requests.

The feature seems to only apply to Android users at this time.

Image credit: Google

Via: 9to5Google

Google reportedly testing launching cloud games directly from Search results

Google is currently testing a way for users to be able to launch games via the cloud directly from the Google Search results, as first shared by Bryant Chappel from The Nerf Report in a Twitter thread.

Until now, Google Search results for games have only shown general information about a title and the platforms it’s available on, but soon, searching for a cloud-based game on Google Search will present a play button tied to cloud streaming platforms. The functionality appears to work with Google’s own Stadia, Amazon’s Luna (not available in Canada), Nvidia’s GeForce Now and Microsoft’s xCloud.

GeForce Now only appears to support Fortnite for now, and according to Chappel, it might be up to the developer to make the title playable directly from the search results.

Upon trying to use the feature, MobileSyrup did not get an option to launch games directly from the search results, which likely means that the functionality is either geo-locked or is being tested with certain users only. It is also likely that users need to be logged in to the specific cloud gaming platform before attempting to search it on Google. Nevertheless, this only adds to the appeal of cloud gaming that allows users to play titles without having to download them, irrespective of the specifications of the machine they’re on.

It is not currently clear when or if Google will roll out this feature widely.

Image credit: @BryantChappel

Source: @BryantChappel

Google’s ‘Multisearch’ feature can help you find things nearby

At Google I/O 2022, the Mountain View, California-based company revealed several new features coming to Google Search, Google Maps and Google Assistant.

Google Search is gaining a new feature in the Google app called ‘multisearch.’ With it, you’ll be able to take a picture or screenshot of things like apparel, home goods and food, then add the words “near me” to the search. Search will then show options for local restaurants or retailers that have the item available.

Google will scan millions of images, reviews, web pages and more from its community of Maps contributors to learn more about these nearby spots.

This feature will be available later this year in English and will expand to more languages in the future.

Google is also working on a new scene exploration feature. When you’re in a book store, for example, and you’re looking at a wall of books, you’ll be able to snap pictures and search for the exact book you’re looking for.

Google says it’s using computer vision, natural language understanding and the knowledge of the web as well as local on-device technology.

This technology won’t be available for a while, according to the tech giant.