Fast Drawing for Everyone

Category: Google | Apr 11, 2017

Drawing on your phone or computer can be slow and difficult—so we created AutoDraw, a new web-based tool that pairs machine learning with drawings created by talented artists to help you draw.

It works on your phone, computer, or tablet (and it’s free!). So the next time you want to make a birthday card, party invite or just doodle on your phone, it’ll be as easy and fast as everything else on the web.

If you’re interested in learning more about the magic behind AutoDraw, check out “Quick, Draw!” (one of our A.I. Experiments). AutoDraw’s suggestion tool uses the same technology to guess what you’re trying to draw.
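
As a purely illustrative toy (none of this is Google’s actual code, and the templates are invented), you can think of shape guessing as matching a user’s stroke points against stored templates:

```python
import math

# Invented templates: each "drawing" is a list of (x, y) points
# normalized to the unit square. AutoDraw's real recognizer is a
# neural network trained on Quick, Draw! data, not a lookup table.
TEMPLATES = {
    "circle": [(0.5 + 0.4 * math.cos(a), 0.5 + 0.4 * math.sin(a))
               for a in (i * math.pi / 8 for i in range(16))],
    "line": [(i / 15, i / 15) for i in range(16)],
}

def guess(points):
    """Pick the template whose points lie closest (summed over the
    stroke) to the user's stroke: a crude nearest-neighbor match."""
    def cost(name):
        template = TEMPLATES[name]
        return sum(min(math.dist(p, t) for t in template) for p in points)
    return min(TEMPLATES, key=cost)

# A wobbly, roughly circular stroke.
stroke = [(0.5 + 0.38 * math.cos(a), 0.5 + 0.41 * math.sin(a))
          for a in (i * math.pi / 6 for i in range(12))]
print(guess(stroke))
```

A real recognizer learns from whole stroke sequences rather than comparing against fixed templates, which is what lets it handle the huge variety of ways people draw the same thing.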

Big thanks to the artists, designers, illustrators and friends of Google who created original drawings for AutoDraw.

HAWRAF, Design Studio
Erin Butner, Designer
Julia Melograna, Illustrator
Pei Liew, Designer
Simone Noronha, Designer
Tori Hinn, Designer
Selman Design, Creative Studio

If you are interested in submitting your own drawings, you can do that here. We hope that AutoDraw, our latest A.I. Experiment, will make drawing more accessible and fun for everyone.

From: http://feedproxy.google.com/~r/blogspot/MKuf/~3/y_dFX5QC1e4/

Taking aim at annoying page jumps in Chrome

Category: Google | Apr 11, 2017

Have you ever opened a link shared by a friend to an article you were eager to read, scrolled to the second paragraph, and found yourself suddenly back near the top of the page, as if everything had shifted beneath you?

These annoying page jumps typically happen when the website inserts an image or other content above the visible area, pushing down what’s on the screen. With the newest Chrome update, we’re introducing something called scroll anchoring, which locks the content you’re currently looking at to the screen, keeping you in the same spot so you can keep reading. Check out a side-by-side comparison, without and with scroll anchoring:

Scroll anchoring is one of our favorite kinds of features—those that shine when no one notices them. Today we’re preventing an average of almost three “jumps” per pageview, and we’re still getting better. If you’re a web developer or you’d like to learn more, see our technical guide to understand how it works and what it means for your website.
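
For web developers: if a page manages its own scroll position (an infinite scroller, for instance), Chrome’s implementation exposes an `overflow-anchor` CSS property for opting out, defined in the CSS Scroll Anchoring draft spec. A minimal sketch (the class name here is hypothetical):

```css
/* Opt this scroll container out of scroll anchoring, e.g. because
   the page adjusts scroll position itself. */
.custom-scroller {
  overflow-anchor: none;
}

/* The default: the browser picks an anchor node and keeps it in
   place when offscreen content changes size. */
body {
  overflow-anchor: auto;
}
```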

From: http://feedproxy.google.com/~r/blogspot/MKuf/~3/O1rp_28IZvQ/

Google Cloud expands Education Grants Program to 30 additional countries

Category: Google | Apr 11, 2017

This month the Google Cloud team attended the Special Interest Group on Computer Science Education (SIGCSE), a conference that brings together 1,200 computer science (CS) professors from around the world. We had the chance to learn from professors who are leading CS innovation at more than 500 universities worldwide. At Google, we understand the critical role professors play in enabling students to build what’s next. Last summer we launched Google Cloud Platform Education Grants for CS professors in the United States. We’re excited to extend this program to 30 new countries across continental Europe, the UK, Israel, Switzerland and Canada.

University professors who teach CS or related subjects in qualifying countries can apply for grants to support their courses. Through the Google Cloud Platform (GCP) Education Grants program, professors and their students can access GCP and use the same infrastructure, analytics and machine learning that power innovation across Google. Whether they’re launching an app seamlessly with Google App Engine or using our Cloud Machine Learning tools, such as the popular Cloud Natural Language API and Cloud Vision API, they can incorporate Google’s state-of-the-art capabilities, like image recognition, into their own web apps.

Computer science professors in certain European Union countries, the UK, Israel, Switzerland and Canada can apply here for Education Grants. Others interested in GCP for Higher Education should complete this form to stay up to date with the latest from Google Cloud.

We look forward to seeing the new ways professors and students will use their GCP Education Grants. We’ll share stories about cool projects on this blog and our social channels.

From: http://feedproxy.google.com/~r/blogspot/MKuf/~3/coW70IDzdpA/

Our focus on pay equity

Category: Google | Apr 10, 2017

Pay equity is a huge issue, not just for Silicon Valley companies, but across every industry in every country.

It’s very important to us that men and women who join Google in the same role are compensated on a level playing field, when they start and throughout their careers here.

That’s why, in the hopes of encouraging a broader conversation around the pay gap – and how companies can fight it – we shared our top-level analysis publicly in 2016. Google conducts rigorous annual analyses to ensure our pay practices remain aligned with our commitment to equal pay.

So we were quite surprised when a representative of the Office of Federal Contract Compliance Programs (OFCCP) at the U.S. Department of Labor accused us of not compensating women fairly. We were taken aback by this assertion, which came without any supporting data or methodology. The OFCCP representative claimed to have reached this conclusion even as the OFCCP is seeking thousands of employee records, including contact details of our employees, in addition to the hundreds of thousands of documents we’ve already produced in response to 18 different document requests.

The fact is that our annual analysis is extremely scientific and robust. It relies on the same confidence interval that is used in medical testing (>95%). And we have made the methodology available to other businesses that want to test their own compensation practices for equal pay.

So how does it work?

In short, each year we suggest an amount for every employee’s new compensation (consisting of base salary, bonus and equity) based on role, job level and job location, as well as current and recent performance ratings. This suggested amount is “blind” to gender; the analysts who calculate it do not have access to employees’ gender data. An employee’s manager has limited discretion to adjust the suggested amount, provided they cite a legitimate adjustment rationale.

Our pay equity model then looks at employees in the same job categories and analyzes their adjusted compensation to confirm that there are no statistically significant differences between men’s and women’s compensation.
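
The significance check above can be sketched as a two-sample test. This is a toy illustration only, with invented figures, a normal approximation, and no resemblance to the scale or covariates of Google’s actual model:

```python
import math
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Welch's t statistic for two independent samples."""
    na, nb = len(sample_a), len(sample_b)
    se = math.sqrt(variance(sample_a) / na + variance(sample_b) / nb)
    return (mean(sample_a) - mean(sample_b)) / se

def significant_gap(comp_a, comp_b, z_crit=1.96):
    """True if the difference in mean compensation is statistically
    significant at roughly the 95% level (normal approximation)."""
    return abs(welch_t(comp_a, comp_b)) > z_crit

# Invented compensation figures for one job category.
men   = [101_000, 99_500, 100_200, 98_800, 101_500, 100_100]
women = [100_800, 99_900, 100_300, 99_100, 101_200, 100_000]
print(significant_gap(men, women))  # False: no significant gap in this toy data
```

With samples this small a proper t distribution (and far more factors, such as role and level) would be needed; the point is only the shape of the check: suggest pay from gender-blind factors, then test whether gender explains any remaining difference.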

In late 2016, we performed our most recent analysis across 52 different major job categories, and found no gender pay gap. Nevertheless, if individual employees are concerned, think there are unique factors at play, or want a more individualized assessment, we dive deeper and make any appropriate corrections.

Our analysis gives us confidence that there is no gender pay gap at Google. In fact, we recently expanded the analysis to cover race in the US.

We hope to work with the OFCCP to resolve this issue, and to help in its mission to improve equal pay across federal contractors. And we look forward to demonstrating the robustness of Google’s approach to equal pay.

From: http://feedproxy.google.com/~r/blogspot/MKuf/~3/GXhuCNwOb8I/

With these Pixel tips, your photos will bloom bright

Category: Google | Apr 10, 2017

Spring has sprung, and so have desert wildflowers, daffodils along the highway, and even the tulips you picked up at your florist. Everything seems to be in bloom—and everyone is sharing photos of flowers on social media.

Your petal pics should be the best of the bunch. So with help from Brittany Asch of BRRCH Floral, we’ve gathered a few tips for taking, storing and sharing the prettiest photos of poppies, petunias, phlox or whatever blossoms you love best—with your Pixel, Phone by Google, of course!

  1. Lighting is key. Check your exposure to ensure you’re giving your plants and flowers the right amount of light to show off the details. On Pixel, you can easily brighten or darken your shot to get the perfect exposure. Tap the camera, then slide your finger up or down to adjust the exposure to the lighting conditions.
  2. Consider your frame. Take note of the surrounding area (is it worthy of the ’gram?). If it’s not, zoom in. If it is, capture a wider angle that shows the full view. On Pixel, you can use a compositional tool to help frame your shot. Look for the grid in the top right of the camera—your arrangement should be at the center of the grid. We recommend the 3 x 3 grid for flowers.
  3. That Lens Blur, tho. This Pixel feature will make your photos look professional. Start by tapping the menu bar at the top left; you’ll see the Lens Blur option second from the bottom. Tap it, take your photo and slowly raise your phone to capture the perfect lens blur. After you snap a photo you can also edit the placement and degree of lens blur. This is great for up-close shots!
  4. Grab a friend. Have him or her pose with your flowers to bring them to life and add depth to the photo.
  5. Take as many as you want! With Pixel, you get unlimited high-quality storage with Google Photos for free, so you can try out as many shots as you need without worrying about filling up your phone with dud buds.
  6. Find ‘em later. Just type “flowers” into the search bar in Google Photos for a bouquet of photos just waiting to be shared.

Photos: Dorothy Hong

Don’t forget to share your snapshots with #teampixel for a chance to be featured on our Instagram account!

From: http://feedproxy.google.com/~r/blogspot/MKuf/~3/_Lasy0u3YQ0/

Say bees! The buzz about Cloud Vision API

Category: Google | Apr 10, 2017

What do declining bee populations and machine learning have in common? Natural personal care brand Burt’s Bees hopes to plant 2 billion bee-nourishing wildflowers through its latest Bring Back the Bees campaign, and has enlisted the Cloud Vision API to help.

For every “selfless selfie” that bee lovers create, Burt’s Bees will plant 5,000 wildflower seeds in its home state of North Carolina, in hopes of restoring the furry pollinators’ habitat and food supply. The Burt’s Bees mobile-optimized site takes the selfie, overlays pictures of wildflowers onto it and encourages people to spread the word by sharing the image back to social media.  

burtsbees-1

But before Burt’s Bees can apply the wildflower filter, Cloud Vision API first analyzes the image to make sure it’s a good fit. Its image recognition capabilities detect whether the image is in fact of a single face, and whether the face is appropriately centered in the frame. It also determines where on the frame the filter can add wildflowers, so that no one’s face is covered.

When done correctly, planting wildflowers can be a simple way to help declining worldwide bee populations. Likewise, using the Cloud Vision API is a simple approach to the difficult task of machine-assisted image recognition. Cloud Vision API provides developers with a drop-dead easy way to access the state-of-the-art machine learning models that Google artificial intelligence researchers have developed over decades. Cloud Vision API can easily detect objects, people, landmarks, logos and text, and describes those attributes in a web-friendly JSON format. Burt’s Bees #SelflessSelfie is just one—albeit very sweet—example of what Cloud Vision API can do.

To upload your own selfless selfie, visit selflessselfie.burtsbees.com, and to learn more about Cloud Vision API and other services, visit our Cloud Machine Learning page.

From: http://feedproxy.google.com/~r/blogspot/MKuf/~3/PKQLOub_aRo/