Categories
Mobile Syrup

Teardown reveals how Pixel Watch fall detection will work

Back when the Pixel Watch launched, Google promised it would add fall detection to the smartwatch “this winter.” Now, it seems the feature could be right around the corner as a teardown of the Pixel Watch app revealed details of how fall detection will work.

9to5Google performed the teardown of the Pixel Watch app version 1.1. As with any teardown, it’s important to remember that code snippets and other discoveries can be misinterpreted, especially when features are still in development. Moreover, things can change before launch.

With that said, let’s take a look at what 9to5 found in the teardown. First, the publication outlined the set-up process for fall detection, sharing text that explains the feature “works by using motion sensors on your Pixel Watch.” It also warns that “high-impact activities may trigger fall detection, and your watch can’t detect all falls.”

Fall detection works in three steps. First, when the Pixel Watch detects a hard fall, it waits 30 seconds, then vibrates, sounds an alarm and checks whether the wearer needs help. If the wearer doesn’t respond, the Pixel Watch attempts to contact emergency services. Finally, the smartwatch plays an automated message during the call that provides the wearer’s location and requests help. If able, the wearer can also speak to the emergency operator. 9to5 also uncovered what the automated message will say:

“You are being contacted by an automated emergency voice service on behalf of a caller. The caller’s watch detected a possible fall, and they were unresponsive. Please send help. Their location is 12.039578 degrees latitude, -121.947872 degrees longitude. This message will repeat 3 times.”
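Based on 9to5Google’s description, the escalation logic can be sketched roughly like this. This is a conceptual illustration only; beyond the 30-second wait and the message wording from the teardown, the function names and structure here are assumptions, not Google’s actual implementation:

```python
import time

REPEATS = 3  # the automated message repeats three times

def build_emergency_message(lat: float, lon: float) -> str:
    """Compose the automated call message described in the teardown."""
    return (
        "You are being contacted by an automated emergency voice service "
        "on behalf of a caller. The caller's watch detected a possible fall, "
        "and they were unresponsive. Please send help. "
        f"Their location is {lat} degrees latitude, {lon} degrees longitude. "
        f"This message will repeat {REPEATS} times."
    )

def handle_hard_fall(wearer_responded: bool, lat: float, lon: float) -> str:
    """Sketch of the three-step escalation: wait, check in, then call."""
    # Step 1: wait 30 seconds after detecting a hard fall (shortened here).
    time.sleep(0)  # stand-in for the 30-second wait
    # Step 2: vibrate, sound an alarm and check whether the wearer responds.
    if wearer_responded:
        return "cancelled"  # wearer dismissed the check-in; no call is placed
    # Step 3: contact emergency services and play the automated message.
    return build_emergency_message(lat, lon)
```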

Finally, 9to5 also found that the fall detection feature should be available in Canada and other countries, including Australia, France, Ireland, Germany, Japan, Taiwan and the U.S.

It’s worth noting that fall detection started showing up for some Pixel Watch owners earlier this year. Since then, users have been eagerly watching for the feature to arrive, though it hasn’t yet. Hopefully, this means fall detection will arrive soon.

Source: 9to5Google

Google Home app can now control your Xbox

Microsoft is now rolling out an Xbox update that lets users control the console from within the Google Home app.

The Google Home app remote typically only works with Android TV and Google TV-powered devices, but now it’ll let users turn their Xbox on or off, navigate back, go home and adjust the volume. You’ll also find a directional navigation touchpad and standard media controls, including play/pause and skip/previous, at the bottom.

There’s also a mute/unmute option alongside a “record game clip” button.

If your Xbox doesn’t appear in the Google Home app, try refreshing your devices. Note that you’ll need to update your Xbox to OS version 10.0.22621.3446 (xb_flt_2302ni.230208-1530).

Source: 9to5Google

Google starts rolling out Privacy Sandbox beta for Android 13

Roughly a year ago, Google first detailed its ‘Privacy Sandbox‘ plan for Android. In short, the goal was to create a way for advertisers to serve relevant ads to users without hurting user privacy. Now, Google’s back with a beta for Privacy Sandbox.

When it comes to serving ads without hurting privacy, the main idea seems to be replacing cross-app and cross-device identifiers like advertising IDs with a service that estimates what your interests are and temporarily saves those on your device.

Dubbed ‘Interests estimated by Android,’ this part of Privacy Sandbox estimates users’ interests based on the apps they have installed. Then developers can leverage Google’s ‘FLEDGE’ API to show ads based on “custom audiences.”

It sounds like advertisers will be able to target, say, users with an interest in a specific topic like smartphones, and apps can then show smartphone ads to Android users if Android determines those users are interested in that topic. (This also ties into Google’s Topics API, which replaced the FLoC system Google had planned to integrate into Chrome.)
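Conceptually, the flow looks something like this. Below is a toy model in Python, not the real Android API; the interest categories, app names and function names are all invented for illustration:

```python
# Toy model of on-device interest estimation and interest-based ad selection.
# All categories, app names and mappings below are invented for illustration.

APP_TO_INTERESTS = {
    "camera.app": ["photography"],
    "phone.deals.app": ["smartphones"],
    "star.map.app": ["astronomy"],
}

def estimate_interests(installed_apps: list[str]) -> set[str]:
    """Device-side step: derive coarse interests from installed apps."""
    interests: set[str] = set()
    for app in installed_apps:
        interests.update(APP_TO_INTERESTS.get(app, []))
    return interests

def select_ads(interests: set[str], ad_inventory: dict[str, str]) -> list[str]:
    """Ad-side step: pick ads whose topic matches an on-device interest.

    The key point of the design: only the coarse interest labels are used,
    never a cross-app or cross-device identifier for the user.
    """
    return [ad for topic, ad in ad_inventory.items() if topic in interests]
```

In the real system, the interest list lives temporarily on the device and apps query it through the new APIs rather than reading an advertising ID.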

Can I opt in (or out)?

Here’s what Privacy Sandbox looks like | Image credit: Google

With that out of the way, the big question about the Privacy Sandbox beta is what it means for users. To start, Google says Privacy Sandbox will roll out “gradually.” It will start with a small percentage of Android 13 devices and expand to more over time. An Android notification will let you know if you’ve been selected (so far, it seems I have not been selected).

Google also says that apps that choose to participate in the Privacy Sandbox beta will be able to use the new APIs mentioned above to show relevant ads to users and measure how effective those ads are.

Users can control their beta participation by heading into Settings > Security & privacy > More privacy settings > Ads (it’s worth noting the exact location of Privacy Sandbox may vary depending on your device manufacturer).

The Privacy Sandbox settings allow users to turn the feature on or off, as well as manage their interests and the apps that can use them to show relevant ads. For example, users can block an interest, which will prevent Android from adding that interest to the list again (though Google says users might still see relevant ads).

It’ll be interesting to see how Privacy Sandbox works out. I’m not convinced it will address the privacy concerns, though it may prove a little better than the existing system of relentless tracking.

Source: Google Via: 9to5Google

Google releases quick fix to solve Google Photos issues on iOS devices

Following the iOS 16.3.1 update across Apple devices, the Google Photos app ran into a fairly troublesome bug. Users reported a persistent issue that caused the app to crash, preventing them from accessing their Google Photos library on iPhone. Thankfully, Google has provided a fix.

Early Tuesday morning, Google launched version 6.23.1 of the Google Photos app on iOS. “We fixed user-reported issues and added new features to provide an enhanced editing experience,” reads the What’s New section. Users report that the app has been much more stable since the update.

At the time of writing, it remains hard to pinpoint how widespread the issue was. Based on affected users’ reports, other Google apps, including Gmail, Google Docs and Google Maps, were all in working order. Google Photos, on the other hand, crashed regularly.

While Google did provide a fix, there’s still no telling what caused the issue in the first place. iOS 16.3.1 wasn’t a major software update; it was designed to fix bugs, and it’s not known whether it’s what broke Google Photos.

As long as users have Photos version 6.23.1 installed on their devices, they should not run into any further issues. Users can double-check if it is installed by navigating to ‘Settings’ and selecting the ‘Google Photos’ app. At the bottom of the page, the app’s current version is displayed.

Source: The Verge

Android 14 includes in-development features to convert, transfer eSIMs

Android 14 may pave the way for other smartphone makers to follow Apple in going all-in on eSIM.

For those unfamiliar with eSIM, it’s a technology that replaces the physical SIM card, meaning your phone doesn’t need a SIM tray. Unsurprisingly, there are pros and cons: eSIMs produce less waste, and some proponents have suggested they could reduce carrier control over customers by making it easier to switch providers. However, restrictive activation practices and other complications have prevented that from happening so far.

For years, most smartphones have offered both eSIM and physical SIM options, but Apple removed the physical SIM on its iPhone 14 line in the U.S., for better and worse. (Here in Canada, the iPhone 14 line still offers a physical SIM card).

However, the first developer preview of Android 14 includes an updated Settings app with new options for eSIM. As spotted by Mishaal Rahman, senior technical editor at Esper, the new settings app contains an option called ‘Convert to eSIM’ (via Android Police). Although not totally clear, it seems this would convert a physical SIM card into an eSIM. The related settings page appears to still be in development and only shows up in search.

Moreover, Rahman says Android 14 DP1 preps an option to transfer eSIMs between different devices. However, these new settings options don’t work on their own and require additional resources. Rahman found the resources in the pre-installed SIM manager app that Pixel phones use for various background SIM card and network-related features. Android Police noted these resources were added as early as the second Android 13 QPR2 beta in January.

On the surface, these improvements sound like they could address some of the pain points of eSIM, but Rahman notes that carriers will need to support activating newly created eSIMs — given how carriers are, I’m sure you can guess how this will go. Similarly, transferring an eSIM from one device to another isn’t part of the eSIM spec, so it’s not clear whether this will be a Pixel-specific feature or if it will work for Android devices in general. The iPhone has a similar issue where it can transfer eSIMs from iPhone to iPhone, but not to or from Android.

You can learn more about Android 14 DP1 here.

Source: Mishaal Rahman Via: Android Police

It looks like the Pixel 4a will just miss Android 14

While smartphone nerds are digging into Android 14 DP1, Pixel 4a owners are sadly left out of the fun.

As spotted by Android Police, Google’s excellent Pixel 4a likely won’t make the jump to Android 14 because of Google’s lame update policy. This time around, however, it stings a little more — the Pixel 4a turns three years old in August, meaning it will just miss eligibility for Android 14.

Google has offered three years of OS upgrades for a while now (starting with the Pixel 6, it tacks on five years of security patches too). Unfortunately, this update policy means the Pixel 4a will miss the Android 14 update. Google’s Android 14 timeline points to an August or September release for the new OS. Moreover, Android 14 DP1 isn’t available on the Pixel 4a.

What’s worse is there’s nothing actually preventing the Pixel 4a from running Android 14. Sure, it’s not the most powerful smartphone, thanks to its Snapdragon 730G chipset, but we also know these decisions aren’t based on a phone’s capability. Otherwise, the Pixel 4 and 4 XL — which are more powerful than the 4a — wouldn’t have just got their last patch.

Anyway, Google should really improve its update policies to at least be in line with other Android manufacturers, like Samsung (who’d have thought Samsung would be the model for Android updates?).

As for Pixel 4a owners, if you don’t want to get a new phone, you may want to start learning how to flash ROMs to squeeze more life out of your device.

Source: Android Police

Android 14 DP1 is available now — here’s what you need to know

The first Developer Preview of Android 14 has arrived, kicking off a months-long testing phase where we’ll get to see much of what’s new in Android ahead of the stable release later this year.

Google released several details about Developer Preview 1 (DP1) as well as a timeline for the preview and beta period ahead on the Android Developers website. Plus, several publications have already started sharing new features and other details about Android 14 based on what they’ve found in DP1.

When will Android 14 release?

The short answer is in August or September 2023, but if you want more detail, read on.

Google published a timeline overview for Android 14 that gives us a pretty good idea how many previews and betas we’ll get, as well as when we’ll get them. DP1 and DP2 will come out in February and March, respectively, and then in April we should get the first Android 14 Beta, with further releases coming roughly every month.

Timeline for Android 14 developer previews and betas

Google expects to hit platform stability in June with Beta 3 — this is when the search giant will lock in the Android 14 APIs and developers can start building apps knowing there won’t be any further changes. Android 14’s final release will come sometime after July — likely August, as in past years.

How to get Android 14 DP1

Getting a developer preview isn’t the easiest thing for most people, so if you’re not a technical user, I’d recommend waiting until the beta releases start coming out. Not only will those be more stable than developer previews, but you can easily enroll your Pixel phone through Google’s Android Beta Program website without wiping your data or flashing anything.

As for the developer previews, you’ll need to head to the Android Developers website and download the Android 14 DP factory image compatible with your Pixel phone (the full list of supported Pixel models is available on the site).

You can find the Android 14 DP1 factory images here — you’ll need to download them and flash them, which will wipe your data (that said, you shouldn’t install the developer preview on your daily driver device in the first place). Once flashed, you’ll get future Android 14 updates delivered over the air.

We won’t get into the detailed step-by-step instructions on this process, but there are plenty of excellent guides online — such as this one from Android Police.

What’s new in Android 14 DP1?

Below is a non-exhaustive list of what’s new in Android 14 DP1. We’ll update this as new features and changes are discovered.

  • Optimizations for devices with large screens.
  • Improvements to standby battery life through changes to how Android communicates app changes.
  • Increased text zoom up to a maximum of 200 percent (previously, 130 percent was the max).
  • Android 14 will block installation of apps targeting older versions of Android — learn more.
  • Battery usage page improvements.

Source: Google Via: Android Police

Google unveils its AI chatbot Bard and feature updates at Paris event

Following in Microsoft’s footsteps, Mountain View, California-based Google announced AI-focused updates to Search, Maps and Translate today at its Live from Paris event, as reported by The Verge.

Just two days ago, on Monday, February 6th, Google announced that its ChatGPT competitor ‘Bard’ will be available to the public in the “coming weeks,” with Google CEO Sundar Pichai describing it as an “experimental conversational AI service” powered by LaMDA. The chatbot made an appearance at the event, where Prabhakar Raghavan, senior vice president at Google, said that users would be able to interact with Bard to explore complex topics, collaborate in real-time and get new and creative ideas.

Google then explained that some questions have No One Right Answer, or ‘NORA.’ This applies to questions like “what is the best constellation to look at when stargazing?” The answers to such questions are subjective. To help answer these queries, Google is introducing generative AI directly into Search results.

Soon, if you ask Google Search a question with no one right answer, the new generative AI features will organize complex information, multiple viewpoints and opinions, and combine them in your search results.

Here are some of the other new announcements made across Google’s platforms, with some features releasing in the near future, and others in the coming weeks and months:

The Street View and Live View mixture, called Immersive View, is now beginning to roll out in five cities, namely London, Los Angeles, New York City, San Francisco, and Tokyo. The feature will next expand to Florence, Venice, Amsterdam, Dublin, and more.

The multi-search tool that allows users to initiate a search using an image and a few words of text is also receiving an update. The feature allows users to take photos of objects like food, supplies, clothes and more, and add the phrase “near me” in the Google app to get search results showcasing local businesses, restaurants or retailers that carry that specific item. The feature was limited to the United States, but is now rolling out globally wherever Google Lens is available. The feature will also be available on mobile web globally in the next few months.

Under Google Maps, the company is adding new features to assist EV drivers, including suggested charge stops for shorter trips, filters for “very fast” charging stations and indications of places with chargers in search results for places like hotels and grocery stores.

Further, ‘Translate with Lens’ for images is now rolling out globally. Previously, translated text was overlaid on top of an image as extra text rather than blended in, which blocked or distorted the image behind it. Now, using a machine learning technique called a ‘generative adversarial network’ (the same approach behind Magic Eraser on Pixel phones), Google Lens can translate the text and blend it back into the background image without distortion, so the image retains its natural look.

ETAs and turn-by-turn navigation in Google Maps will now be visible on your lock screen; the feature is also compatible with iOS 16’s Live Activities. Google Translate will also introduce additional context and information for certain words or phrases, starting with English, French, German, Japanese and Spanish in the coming weeks. The new Google Translate design for Android will come to iOS in the near future.

Follow the links to learn more about the new Search, Maps and Translate features.

Image credit: Google 

Source: Google, Via: The Verge

It costs Google $413 USD to produce a Pixel 7 Pro: report

Counterpoint Research released a new report breaking down the cost of the mmWave variant of the Pixel 7 Pro. The phone costs around $413 USD (about $553.85 CAD) to manufacture, and roughly 50 percent of its components are made by Samsung.

Spotted by 9to5Google, the Counterpoint Research report breaks down the bill of materials for the mmWave Pixel 7 Pro to reveal which components came from which manufacturers and how much everything cost. However, it’s worth keeping in mind that the $413 figure only covers the production cost, leaving out things like marketing, research and development, and other production-related fees.

Moreover, Google offers Pixel 7 and 7 Pro models, a set with Sub-6 5G and a set with both Sub-6 and mmWave 5G. The mmWave models (GQML3 and GE2AE, respectively) generally aren’t available in Canada, given that Canadian networks don’t yet offer mmWave 5G. Presumably, the mmWave models cost more to make, so the $413 figure likely represents the upper end of production costs for Google’s phones.

With that in mind, it’s particularly interesting to see that in the parts breakdown, the Pixel 7 Pro display takes up about 20 percent of the cost to produce the device. Samsung makes the QHD+ AMOLED display used in the Pixel 7 Pro.

Google’s custom Tensor G2 processor and Titan M2 security chip account for seven percent of the total component cost. Google developed the processor jointly with Samsung, and it’s manufactured using the 5nm extreme ultraviolet lithography (EUV) process. Google’s Tensor G2 cost an estimated $10 more to produce than the first-gen Tensor chip.
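For context, the report’s percentages work out to rough dollar figures like these (a quick back-of-the-envelope calculation based on the numbers above):

```python
TOTAL_COST_USD = 413  # Counterpoint's estimated production cost

display_share = 0.20  # display: roughly 20 percent of the total
soc_share = 0.07      # Tensor G2 + Titan M2: roughly 7 percent of the total

display_cost = TOTAL_COST_USD * display_share  # about $82.60
soc_cost = TOTAL_COST_USD * soc_share          # about $28.91
```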

Samsung also makes the camera sensors used in the Pixel 7 Pro (main, zoom, and selfie), as well as the components needed for telecom connectivity, though the mmWave antennae are developed jointly with Murata. Samsung and Micron jointly provide the Pixel 7 Pro’s RAM.

There are several components sourced from companies other than Samsung, too. This includes Skyworks for Wi-Fi connectivity and SK Hynix for the 128GB NAND flash storage modules built to the UFS 3.1 standard. Sunwoda Electronic packages the battery while ATL supplies the battery cells, and NXP and IDT supply the quick-charging IC and wireless charging coil, respectively.

You can check out the full report here.

Source: Counterpoint Research Via: 9to5Google

Apple hosting Employee-Only AI Summit at Apple Park next week

Apple is reported to be holding an in-person AI summit next week at the Steve Jobs Theater, nestled in Apple Park. The employee-exclusive event appears to mark a return to Apple’s pre-COVID events model.

Many large tech companies are wading into the AI conversation, and Apple is among them: Bloomberg’s Mark Gurman claims an event “like a WWDC but for AI” is around the corner. However, rather than inviting media and developers to hear about new innovations, Apple’s summit will apparently be “only for Apple employees.”

The AI summit is due to be held next week at the company’s Cupertino, California headquarters. The event will apparently be attended primarily by employees, though it’s said to be streamed as well. Gurman notes this is “essentially how Apple held media events pre-Covid.” Prior to the pandemic, Apple commonly invited journalists to the Steve Jobs Theater to witness the reveal of new products like the iPhone, with a live stream also available. The upcoming summit may signal a return to that approach.

What will be discussed at the AI summit is still anyone’s guess. Given this is an employee-exclusive event, it’s likely that consumer-focused innovations will not be announced. Instead, Apple will probably discuss further investments in machine learning and how AI can benefit the company’s core development.

Apple is far from the only company with AI on its mind in 2023. OpenAI’s ChatGPT has continued gaining traction since launching two months ago, and Microsoft is already looking to make significant investments in ChatGPT, including possible integration into its Bing search engine. Google is also pursuing AI with its ChatGPT competitor, Bard. Announced this week, Bard may soon be able to refine search queries and answer them “in a human-like manner.”

Image credit: Apple

Source: @markgurman