Today, we are excited to announce a new scholarship program with Bertelsmann. Over the next three years, Bertelsmann and Udacity will provide up to 50,000 scholarships in the areas of Cloud Engineering, Data Science, and Artificial Intelligence. This effort expands Udacity and Bertelsmann’s partnership, as well as their joint efforts to provide enhanced learning opportunities in emerging technologies.
The program is structured in two phases. In the first phase, 15,000 applicants per subject area will be selected to participate in a 3-month Scholarship Challenge. In the second phase, the top 5,000 performers from the Challenge phase in each subject area will be awarded a full scholarship for a Udacity Nanodegree program.
Machine learning is impacting countless industries, from the recent discovery of a black hole to improvements in healthcare, and we are just scratching the surface. The retail industry is a prime example: retailers and manufacturers are racing to figure out how they can employ machine learning to target specific consumers, monitor trends, and discover new pricing models.
While retailers and manufacturers are doubling down on new ways to target and sell to consumers, Jia Rui Ong, a two-time Nanodegree program graduate, and his team are employing machine learning to help you, the consumer, find the best price for the clothing you desire.
We recently had a chance to sit down with Jia Rui Ong and his team at Yux to discuss their product, as well as our newly updated Machine Learning Nanodegree program.
We’re working with Amazon Web Services (AWS) and their AWS Educate program to teach you how to deploy machine learning models using Amazon SageMaker.
Over the past few years, the demand for machine learning engineers and specialists has soared, with these roles ranking among the top emerging jobs on LinkedIn. Machine learning has recently been adopted across a wide range of industries, from medical diagnostics to finance and beyond. Udacity’s Intro to Machine Learning Nanodegree program and Machine Learning Engineer Nanodegree program were built in response to this demand, to provide access to this growing tech field.
We’ve seen advances in research and industry practices as more companies look to build machine learning products. In particular, there is growing demand for engineers who can deploy machine learning models to a global audience. Deployment means making a model available for use in a hardware device or web application, such as a voice assistant or recommendation engine. Knowing how to build machine learning models is a great starting point, but to truly make an impact at scale, a data scientist or programmer needs the techniques and tools to deploy that model so that it’s highly accessible.
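To make the idea of deployment concrete, here is a minimal sketch of serving a model’s predictions behind a web endpoint, using only the Python standard library. The `predict` stub, the request format, and the port are illustrative assumptions, not the program’s actual stack; in practice the model would be a trained artifact served by a platform such as Amazon SageMaker.

```python
# A minimal sketch of model deployment as a JSON web endpoint (stdlib only).
# The "model" below is a stub stand-in; a real deployment would load a
# serialized, trained model artifact instead.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def predict(features):
    """Stub model: labels the input 'positive' when the feature sum is >= 0."""
    return "positive" if sum(features) >= 0 else "negative"

class PredictHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the JSON request body, run the model, and return the prediction.
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length))
        body = json.dumps({"prediction": predict(payload["features"])}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Serve predictions at http://localhost:8000 until interrupted.
    HTTPServer(("localhost", 8000), PredictHandler).serve_forever()
```

A client would POST a body like `{"features": [1, -2, 4]}` and receive `{"prediction": "positive"}` back; managed platforms handle the same request/response loop at scale, with autoscaling and monitoring on top.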
To keep up with these advances and bring the best educational experience to our students, we are updating the Machine Learning Engineer Nanodegree program with two new projects focused on deployment skills.
Continuing our celebration of this year’s Women’s History Month, we would like to introduce Ayşin Taşdelen, an artificial intelligence professional and three-time Nanodegree program graduate. Her curiosity and desire for new skills have led her through three Nanodegree programs, new jobs, and side projects.
We recently had a chance to speak with Ayşin to hear about her motivations and interest in pursuing cutting-edge technologies.
You studied mathematics during your university years and then became a programmer. What were some of your initial career goals?
I really enjoyed my university studies, so much so that I initially looked into becoming a full-time researcher. Leaving academia was a tough decision: I loved learning, but I also knew that starting a traditional career would help me financially. I decided to go the career route and follow my interest in computer science. My initial career goal was to land a job and improve my programming skills.
As your career has developed, how have you satiated your desire to learn?
Over the years, I have tried to keep up with industry articles and books about the latest computer and tech trends. As the internet surged, I started using online library subscriptions and video learning paths. Reading and watching videos were great, but they only got me so far; I never felt like I was learning enough about a subject or concept until I found Udacity.
In 2016, Udacity released the very first free course on TensorFlow in collaboration with Google. Since then, over 400,000 students have enrolled in the course and joined the AI revolution. We’re excited to release an all-new version of this free course featuring the just-announced alpha release of TensorFlow 2.0: Intro to TensorFlow for Deep Learning. This update makes AI even more accessible to everyone, and we’ve again worked directly with the deep learning experts at Google to ensure you’re learning the very latest skills to utilize TensorFlow.
This free course is a practical approach to deep learning for software developers. Our goal is to get you building state-of-the-art AI applications as fast as possible, without requiring a background in math. If you can code, you can build AI with TensorFlow. You’ll get hands-on experience using TensorFlow to implement state-of-the-art image classifiers and other deep learning models. You’ll also learn how to deploy your models to various environments including browsers, phones, and the cloud.
Machine Learning for Everyone
The alpha release of TensorFlow 2.0 is a big milestone for the product. TensorFlow has matured into an entire end-to-end platform. In this alpha release, TensorFlow has been redesigned with a focus on simplicity, developer productivity, and ease of use. This release integrates Keras more tightly into the rest of the TensorFlow platform so that it’s easier for developers new to machine learning to get started with TensorFlow. Along with standardizing around Keras as the main API, other deprecated and redundant APIs have been removed to reduce complexity in the framework. A general release candidate will be available later in Q2 2019.
This Udacity Nanodegree program graduate enacted a full-scale career change to become a Machine Learning Engineer.
Meet Robin Stringer. Robin worked as a journalist, a translator, and a marathon race guide for visually impaired athletes before a conversation about coding caused him to reevaluate his long-term career plans.
While he was working for a para-athletics non-profit in New York, he began learning Python online, and in the course of doing so discovered Udacity’s programs. He moved to Seattle and took the opportunity to pursue his coding studies full-time, with the goal of pulling off an audacious career change. After studying some of Udacity’s free courses, he started the Self-Driving Car Engineer Nanodegree program and got his first taste of machine learning. After a lot of work, he successfully landed a full-time role as a machine learning engineer.
We chatted with Robin to learn how he made his career change happen.
We talk with Dan Romuald Mbanga, Global Lead of Business Development for Amazon AI, about teaching students to use SageMaker for training and deploying deep learning models.
Deep Learning is one of the most exciting technology fields in the world today. Because Udacity’s learning platform is built for maximum adaptability, our Deep Learning Nanodegree program is one of our most dynamic and future-facing programs, and we continue to respond to advances in the field by augmenting and enhancing our curriculum.
We are very excited to share details about the latest additions to our program curriculum, which include new content and projects focused on PyTorch and SageMaker. In a recent post by Cezanne Camacho, Curriculum Lead for Udacity’s School of Artificial Intelligence, we discussed new PyTorch content, and today, we’re going to explore how we’ll be teaching students to use SageMaker for training and deploying deep learning models.
To build this new content, we teamed up with AWS and the SageMaker team. In the updated program, students will train and deploy a sentiment analysis model on SageMaker, then connect it to a front end through an API using other AWS services. After deploying a model, students will also learn how to update it to account for changes in the underlying training data, an especially valuable skill in industries that continuously collect user data.
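For readers new to the task itself, here is a toy sketch of sentiment analysis using only the Python standard library. The word lists and scoring rule are illustrative assumptions, not the project’s approach; the actual project trains a model on labeled review data and serves it from a SageMaker endpoint rather than using a hand-written lexicon.

```python
# A toy lexicon-based sentiment scorer (stdlib only), sketched purely to
# illustrate the task. The word sets below are assumptions for this example;
# a trained model learns such associations from labeled data instead.
POSITIVE = {"great", "love", "excellent", "good"}
NEGATIVE = {"bad", "terrible", "poor", "hate"}

def sentiment(text):
    """Score a review by counting lexicon hits; a tie counts as 'neutral'."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"
```

A lexicon like this breaks down quickly (negation, sarcasm, new vocabulary), which is exactly why the project trains a model and, as noted above, teaches students to retrain it as the underlying data shifts.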
To provide a closer look into the world of SageMaker, we spoke recently with Dan Romuald Mbanga, Global Lead of Business Development for Amazon AI, and a leader of business and technical initiatives for Amazon AI platforms.