AWS Summit ANZ 2021 - Keynote Day 2


(ENERGETIC MUSIC PLAYS) SPEAKER: In Australia, we acknowledge the lands of the traditional custodians across the country. We recognise Aboriginal and Torres Strait Islander peoples and their continuing connection to land, water and sky. We acknowledge the hundreds of different nations across the country, from the Bardi of Western Australia, the Larrakia in the top end, and the Meriam Mer in the north of the Torres Strait. Yesterday, we heard from some of our amazing customers, organisations that are using AWS to accelerate the pace of transformation and innovation. Today, we're going to focus on that exciting moment when inspired builders find a meaningful problem, and using the right tools, go out and make a difference in the world.

As individuals and as a society, we all face difficult challenges: from global issues such as climate change and food supply to more specific challenges, like rapidly changing markets, competitive disruption, and rising customer expectations. At AWS, we truly believe builders play a key role in innovating to solve these problems, by optimising existing solutions or inventing new ones. Our goal is to help you use emerging technologies to build these solutions, so that you can do more of what matters to you, to your organisation, or your community. And we believe AWS has the broadest and deepest platform for quickly building robust, scalable, and secure solutions. Today, we provide more than 200 fully featured services including 12 services launched into the AWS Sydney region in just the last year. As we heard from AWS' local Managing Director in Australia and New Zealand, Adam Beavis yesterday, we see technology playing an important role in accelerating change in Australia and New Zealand.

Today, we'll outline five digital trends that are impacting us and visit local builders who are harnessing technology to effect change within their communities. We'll hear how cloud is extending out of data centres, meaning cloud software is now running as close as possible to where it is needed. This is improving how we can deliver solutions that improve people's lives. A little later we'll talk about a startup in Brisbane that is bringing a data-centric approach to agriculture.

We’ve also seen cloud technology transform the way we use and manage data in the digital realm, but now we are seeing the cloud optimise how we move through the physical world. I saw this in action at one of our customers in the energy industry that is using cloud-based data to transform and optimise physical processes and create a safer working environment. More and more of that data will be pictures, video and audio. You’ll hear how these rich data types are being used by another AWS customer to improve women’s health and bring joy to couples across the world. And you’ll see how machine learning is playing a key role for this customer, enabling them and many others to process vast amounts of data, creating an Internet of Machine Learning that is helping us all solve ever more complex problems. And lastly, you’ll hear from one of the world’s foremost experts on quantum computing about how this most advanced, most complex technology will bloom as it becomes more affordable, available, and understandable to as many people as possible.

First, the cloud will be everywhere. Not only will cloud infrastructure continue to expand geographically, as illustrated by our recent announcement of our second Australian AWS Region, to be located in Melbourne, but cloud elasticity, self-service, and automation are also reaching out from our AWS Regions and Availability Zones to practically everywhere. We now see cloud technologies being used everywhere: from out at sea, to rural Australia, and even near Earth orbit. Customers are finding novel ways to utilise AWS Local Zones, AWS Outposts, 5G, satellite connectivity, and an ever-increasing array of devices in ways we could never have imagined. To see how far the cloud has reached, I travelled to Queensland to visit with startup Ceres Tag. Born from a deep passion for animals and the future of the agriculture industry, Ceres Tag has developed an animal monitoring platform using satellite connectivity to extend the cloud into rural Australia.

(RELAXING MUSIC PLAYS) SARAH COOK: On 4,000 square kilometres, you can appreciate we don’t get visibility of our animals like you might on a small farm. We’ll drive past the animals occasionally, and we’ll make assumptions, but I just feel like there’s so much more that we can discover and ways that we can grow and become better at what we do. QUINTON ANDERSON: Food security, population growth and a warming climate are placing enormous pressure on our agricultural systems. As the cloud expands to reach practically everywhere, innovations will emerge to help face these issues head on. One startup in Brisbane is leading the way. Can you tell me a little bit about your background? DAVID SMITH: So, I grew up on a property in the Hunter Valley.

Even though I had the opportunity to work as an engineer, all around the world, coming home to agriculture was just really where I wanted to be, and being able to bring new ideas in the way you collect data, the way you analyse data, the way you use data to make decisions, it wasn't there in Ag like it is in other industries. QUINTON ANDERSON: Which specific problem within the industry are you solving? DAVID SMITH: The things that we wanted to know were: Where are our cattle? How many are there? And what condition are they in? QUINTON ANDERSON: Can you tell me about the Ceres Tag solution? DAVID SMITH: Anyone, anywhere, anytime can go to our website and purchase the tags, have them delivered directly to their door. The farmer or the end user is then able to go out and apply them to the animals, and within minutes, the data is being transferred from the tag on the animal via satellite, directly down to our ground stations, through the AWS Cloud, back to the farmer. The farmer is then able to utilise that information, not just for on-farm solutions, but also for things like finance and insurance. QUINTON ANDERSON: The solution seems fairly simple conceptually, why do you think it hadn't been done before? DAVID SMITH: Certainly, when we started the journey, a lot of people told us that it would be impossible to do.

And it was funny, as we went along the journey, how technology caught up with us. When you have a great team that are imaginative, that can solve these problems and work through them, we were able to generate a platform that took a complex issue and turned it into a very simple-to-use system. (LIVELY MUSIC PLAYS) HEIDI PERRETT: Having grown up on a cattle property, I’m very proud of being able to bring some of the potential of technology to the agricultural sector.

QUINTON ANDERSON: Can you tell me a little bit about the Ceres Tag solution? HEIDI PERRETT: We had to have a full constellation of low Earth orbit satellites in order for this to work. The data travels from the satellite to the ground station and then there's an ingestion and parsing process that happens before the data is stored in our IoT database. We have some APIs to manage the ingestion into our Ceres Tag management system, which sits on AWS infrastructure. We've managed to secure it really effectively with Amazon Cognito, and we also use an Amazon RDS database that is encrypted at rest. QUINTON ANDERSON: I imagine that you've gone through a number of iterations on the device itself, can you talk through some of that evolution? HEIDI PERRETT: The first iteration was a lot larger and there were weight issues around that.
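(The real Ceres Tag payload format and datastore schema aren't public; the following is a minimal, hypothetical sketch of the "ingestion and parsing" step Heidi describes, with all field names invented for illustration.)

```python
# Hypothetical sketch only: the actual Ceres Tag payload format and datastore are not public.
# This illustrates the "ingest and parse" step for a satellite-relayed message.
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class TagReading:
    tag_id: str
    latitude: float
    longitude: float
    activity_index: float   # e.g. a value derived from accelerometer data
    received_at: str

def parse_tag_message(raw: bytes) -> TagReading:
    """Parse a (hypothetical) JSON message relayed from the ground station."""
    msg = json.loads(raw)
    return TagReading(
        tag_id=msg["tag_id"],
        latitude=float(msg["lat"]),
        longitude=float(msg["lon"]),
        activity_index=float(msg.get("activity", 0.0)),
        received_at=datetime.now(timezone.utc).isoformat(),
    )

if __name__ == "__main__":
    raw = b'{"tag_id": "CT-0001", "lat": -20.7, "lon": 139.5, "activity": 0.42}'
    reading = parse_tag_message(raw)
    print(asdict(reading))  # in production this record would be written to the IoT datastore
```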

We also made improvements to the actual embedded systems in the tag. QUINTON ANDERSON: And I imagine the battery technology and just power in general is quite a complication? HEIDI PERRETT: We have a unique battery chemistry, but we also have some intelligent power management to be able to make sure that we get the most out of our battery and our solar panel while also maintaining that data transmission rate. QUINTON ANDERSON: Did you undertake any kind of penetration testing prior to launch? HEIDI PERRETT: We undertook some extensive penetration and performance testing and they were really valuable for us, making sure that an external party was able to test our system and make sure that it was robust by the time we released. QUINTON ANDERSON: What's one of the findings or issues that you had to correct based on that testing? HEIDI PERRETT: So, we have quite an extensive API set up. We did find that we had to shore up some access issues in our API. We're really glad that we got that done before we released.

QUINTON ANDERSON: What keeps you up at night from an operational perspective? HEIDI PERRETT: Post-launch, we've been very busy with our customers and what keeps me up at night is being able to continue to look after those customers and grow our product with the demand.

QUINTON ANDERSON: What are some of the key considerations as you continue to scale? HEIDI PERRETT: Already we have in development other product offerings, not just for livestock. We’ve had a lot of interest in wildlife as well as companion animals. And building the relationship with the software providers, we wouldn’t have the solution and the platform that we’ve delivered if we didn’t have acknowledgement and buy-in from the existing software providers in the ecosystem. Being able to bring that to reality and enrich, enhance and uplift what’s already there makes me really excited. There’s so much out there that can be done and it’s only just starting. (MUSIC PLAYS) QUINTON ANDERSON: This story reminds us just how empowering cloud technologies can be.

You can aim to solve globally impactful problems with the right skills and your unique perspective and drive. DAVID SMITH: If we're really going to feed the world with 10 billion people with the assets that we have, the only way we will do that is if we start adopting technology more. But all the way along the supply chain, no matter whether you're a breeder, a backgrounder, a feedlotter, a processor, a financier, an insurer, the information that Ceres Tag provides you is that securitisation of information, that certainty that gives you confidence every time you make a decision. And when you do that for such a broad set of customers along a supply chain, that's when you start to see whole societal change. (MUSIC PLAYS) SARAH COOK: Ceres Tag is bringing those animals into our offices, onto our phones.

We can be far more efficient about directing our resources, but we really don't know what this is going to do for us in the bigger picture. It's exciting to be able to track the animals, but I'm also excited about what it's going to bring us that we're not aware of yet. (MUSIC PLAYS) QUINTON ANDERSON: Putting IoT sensors on animals in a rural setting is an extremely difficult device problem. A range of issues needed to be addressed, from the safe application of devices to animals, through to the power and connectivity issues associated with locations that have little or no communications infrastructure. Ceres Tag have invited strict external review from experts at the CSIRO to ensure that the tag design takes the health and long-term impact on the animal into account.

The sensor balances information gathering with power constraints, collecting accelerometer data continuously while only gathering GPS locations and transmitting data periodically. This reduces the energy-intensive transmissions each day and meets the use case without needing to install fixed infrastructure across a large geographic area. Ceres Tag illustrates just how powerful industry collaboration can be.

The solution brings together their team of engineers based in Brisbane and New Zealand, research organisations such as CSIRO and Data61, technology providers like AWS, and connectivity partners, to deliver both the basis for a cattle management solution and the beginnings of a supply chain data platform with endless potential. The Ceres Tag platform has global applications, with deployments across multiple AWS Regions, satellite communications using AWS Ground Station, and a platform of APIs deployed into containers, with identities managed by Amazon Cognito. Beyond Ceres Tag, AWS is enabling a range of other use cases through device and integration collaborations. AWS IoT Core for LoRaWAN is a fully managed service that enables customers to connect devices that use the LoRaWAN low-power, long-range wide area network protocol to AWS. Low-energy sensors can now be deployed into low-infrastructure environments and seamlessly integrated into AWS to provide operational awareness and control. And AWS IoT Greengrass brings local compute, messaging, data management, sync, and ML inference capabilities to devices.
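As an illustration of getting device data into AWS IoT Core, here is a minimal sketch of a gateway publishing a telemetry message with boto3; the topic name and message fields are assumptions, not Ceres Tag's or any partner's actual schema.

```python
# Minimal sketch: publishing a sensor reading to AWS IoT Core from a gateway that already has
# network connectivity and AWS credentials configured. Topic and fields are illustrative only.
import json
import boto3

iot_data = boto3.client("iot-data", region_name="ap-southeast-2")

message = {
    "device_id": "sensor-42",
    "temperature_c": 31.5,
    "battery_v": 3.7,
}

iot_data.publish(
    topic="farm/sensors/sensor-42/telemetry",  # an IoT rule could route this to storage or analytics
    qos=1,
    payload=json.dumps(message).encode("utf-8"),
)
```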

This brings us to our second trend. Robotics platforms, sensors and machine learning have now reached maturity, making them accessible to a much larger builder community. One local Australian organisation is using cloud technologies to improve the safety and efficiency of workers in remote environments.

I travelled to Western Australia to speak with Woodside about taking robotics from the lab to the field. (MUSIC PLAYS) The digital world is becoming ever more connected to the physical world, progressing rapidly towards autonomous robots and augmented reality, helping tackle some of our most daunting challenges and improving our quality of life.

(MUSIC PLAYS) There are a growing number of use cases in which simplifying assumptions make robotics key enablers today. Woodside operated 6% of global LNG supply in 2020 and they are pushing robotics forward both in the lab and the field. MARK MICIRE: Woodside was looking at ML and AI and all of the kind of technologies that were coming into maturity and realised that there was a real opportunity for us to leverage robots as a mobility platform to take some of the sensor gathering and other things out into the field and to gather data that way. QUINTON ANDERSON: Can you talk through the pipeline infrastructure and the particular pipelines that you’ve landed on today? ROBERT REID: So if we’re doing code development, then we’ll be doing that on a development robot out here in the Carter Lab.

Once we're happy with some of the code changes we're making we'll be pushing them up to GitHub where the CI/CD processes will kick off and build those changes into fresh Debian packages. We have a staging robot also out in the carpark so once those packages have been built up into a new Docker image, we'll pull it down onto the staging robot and we'll spend multiple days actually testing. And once we're happy that the robot is performing as expected then we'll actually push that image to the production robot which is sitting up in Karratha right now. (MUSIC PLAYS) QUINTON ANDERSON: Woodside are conducting robotics trials with a team on the ground in Karratha, 1,500 kilometres to the north of their Perth-based lab. We're able to speak to them via their remote ops facility. Can you give us a description of what your typical deployment looks like up there? DELENE JONES: We come down to what you can see here is our Bot Box then we can proceed to take the robots out.
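(As an editorial aside, the staged promotion Robert describes - build, test on a staging robot, then push the same image to production - can be sketched with the Docker SDK for Python. The registry path and tags below are placeholders, not Woodside's actual pipeline.)

```python
# A hedged sketch of promoting a tested container image from staging to production.
# In a real pipeline this step would be driven by the CI/CD service after staging tests pass.
import docker

REPO = "123456789012.dkr.ecr.ap-southeast-2.amazonaws.com/robot-stack"  # placeholder registry

client = docker.from_env()

# Pull the image that survived testing on the staging robot.
image = client.images.pull(REPO, tag="staging")

# Re-tag it as the production release and push; the production robot then pulls this tag.
image.tag(REPO, tag="production")
for line in client.images.push(REPO, tag="production", stream=True, decode=True):
    print(line)
```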

We will then deploy that around the ETP area, which is where we work at Pluto, and that basically contains a little map or little path that the robot takes, so it deploys around our little plant, captures some data for us, takes some images and then comes back and basically redocks itself on its little docking station here back in the Bot Box. QUINTON ANDERSON: Can you tell me about the software architecture that you have in your stack? ROBERT REID: ROS is the glue that really brings the various parts of the robot together. We have a range of sensors and their device drivers, and ROS allows us to take the data from each of those sensors, brings them together with a range of algorithms such as localisation, obstacle detection, navigation, and it also allows us to encode the images as video, for example, so that we can push that data up to the cloud.
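(To make the "ROS as glue" idea concrete, here is a minimal, hypothetical rospy node that subscribes to two sensor topics and periodically publishes a combined status message that an uplink process could forward to the cloud; topic names and the status format are assumptions.)

```python
#!/usr/bin/env python
# Minimal ROS 1 (rospy) sketch: aggregate a couple of sensor topics into a periodic status message.
import json
import rospy
from sensor_msgs.msg import LaserScan
from nav_msgs.msg import Odometry
from std_msgs.msg import String

latest = {"scan_ranges": 0, "x": None, "y": None}

def on_scan(msg):
    latest["scan_ranges"] = len(msg.ranges)

def on_odom(msg):
    latest["x"] = msg.pose.pose.position.x
    latest["y"] = msg.pose.pose.position.y

def publish_status(event, pub):
    pub.publish(String(data=json.dumps(latest)))

if __name__ == "__main__":
    rospy.init_node("status_aggregator")
    rospy.Subscriber("/scan", LaserScan, on_scan)
    rospy.Subscriber("/odom", Odometry, on_odom)
    status_pub = rospy.Publisher("/robot/status", String, queue_size=1)
    rospy.Timer(rospy.Duration(1.0), lambda evt: publish_status(evt, status_pub))
    rospy.spin()
```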

We bring all of those various components together through our CI/CD pipeline that runs on AWS services. QUINTON ANDERSON: Moving from the lab out into the wild is quite a large change. Can you talk to some of the challenges that you and your team have had to solve? DELENE JONES: Yeah, well you can imagine robots in a lab environment work pretty differently to out in the field environment. We had a few problems with communication issues, so we had to really work on that communication infrastructure to try and help us keep that connection with the robot all the time.

A couple of the other issues we had were localisation, so the robots would sometimes get a little bit confused about where they were in their world, in their map. And the other thing we had at sort of the start in the early days: thermal issues, so we found the robots would overheat. MARK MICIRE: Frankly for a lot of this equipment, it's lab equipment that we're adapting to the "real world".

So we're figuring out how it breaks, we're actually searching for those data points that you're only going to find after the thousandth hour of testing. And those data points are the ones that we want to find now, and we want to really find them in that testing environment. That way, when we're working in a real operational environment, we've already scrubbed out all of those problems. QUINTON ANDERSON: Your test environment is quite impressive, you've built a lot of the actual physical environment into your test lab. Can you tell me a little bit about how you think about testing? ROBERT REID: We have the regression tests, which we're executing every time we kick off the CI/CD pipelines.

We have some simulations that we're running also as a part of those CI/CD pipelines. MARK MICIRE: For the, let's say, thousands of hours that we do out in the field, you can put hundreds of thousands of hours in simulation. Our plant doesn't change a lot. So unlike autonomous vehicles and other robots that are in very dynamic environments, we can cheat a little bit and go through and generate a point cloud ahead of time that's got centimetre, even millimetre, accuracy. It really gives us some advantages when we're trying to make plans and to tell the robots how we want them to move through the world and what we want them to do.

(UPBEAT ELECTRONIC MUSIC) QUINTON ANDERSON: The team in the lab have worked with AWS for a number of years now. Can you tell me about what that relationship means between your team, and the platform and AWS? MARK MICIRE: I would say the biggest thing there is the idea that historically robots and the way that they capture and store and then process data, each one has been a bespoke solution. One of the things that we immediately recognised in leveraging AWS is the way that we manage the robots, the way that we do the data stores, the way that that information can then be ingested into our digital twins and then be used. The robots are not unique in their ability to go gather data, and so what AWS does is allow us to work in that same ecosystem that we’re using for all of the rest of the digital infrastructure that we’re building. (INSPIRING MUSIC) QUINTON ANDERSON: So your team’s made some amazing progress, what’s next? ROBERT REID: We have the robots out in the field being deployed right now.

At the moment, when they're driving around, we have a safety operator just checking in to make sure that they are doing everything correctly. Our goal long term is to take that safety operator out of the field and get ourselves to a place where we can actually do remote operations. MARK MICIRE: And then the longer-term, blue-sky stuff that we're working on is more related to the robot's ability to manipulate things in the environment. We do see a future in which a robot is able to go and affect things in the world as opposed to just moving through the world and monitoring it.

(UPBEAT MUSIC) QUINTON ANDERSON: When you look back years from now, what are you going to be most proud of? What are you going to look back on and really hold dear? DELENE JONES: It’s pretty awesome to be able to say we work for a company that is really trying to lead the world in trying to get this technology out. So I think to be a part of that now at the forefront, I think that’s going to be something to be really proud of. QUINTON ANDERSON: These builders are solving meaningful problems through the integration of the digital and physical worlds, underpinned by a range of sensor data, a broad range of data management capabilities and a mature software delivery practice. Their commitment to being at the forefront of this revolution is indeed something to be proud of. This is an awesome example of how an engineering mindset can be applied to a large, complex problem.

The Woodside team reduced the number of variables that sit outside the system, simplifying the underlying solution. They undertook a comprehensive surveying process to create detailed 3D point clouds of their facilities, which the robots use for localisation and planning purposes. This approach to simplifying the problem space is becoming more common in industrial settings and opens up a world of possibilities elsewhere.

There are of course many challenges with setting up these data sets, including raw data collection and survey work. But there is also a growing number of AWS services and specialist partners who can help with integration, annotation, and labelling of the data. For instance, Amazon SageMaker Ground Truth makes it easy to label training data for machine learning at scale, enabling sensor, image and video data to be fused using various labelling tasks to identify and track objects and perform semantic segmentation.
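For example, a Ground Truth labelling job takes an input manifest of JSON Lines pointing at objects in S3. A small sketch of preparing one, with placeholder bucket and keys:

```python
# Sketch of preparing a SageMaker Ground Truth input manifest. Each line is a JSON object whose
# "source-ref" points at an object in S3; the bucket and object keys below are placeholders.
import json

image_keys = ["frames/frame-0001.png", "frames/frame-0002.png", "frames/frame-0003.png"]
bucket = "my-labelling-bucket"  # placeholder

with open("dataset.manifest", "w") as f:
    for key in image_keys:
        f.write(json.dumps({"source-ref": f"s3://{bucket}/{key}"}) + "\n")
# Upload dataset.manifest to S3 and reference it when creating the labelling job.
```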

And in June last year we announced support to label 3D point clouds. Tasks can be distributed to an internal workforce, or outsourced to an AWS Partner through AWS Marketplace, giving you the tools you need to model your target environment. Another reason that the Woodside example is so inspiring is how the team is working through the difficulties of moving from the lab out into the real world. It was important to the team that they continue to be roboticists, and so removing the undifferentiated heavy lifting was a big focus. At the same time the team recognised the importance of creating a robust build pipeline and deployment process.

AWS CodePipeline and AWS IoT Greengrass have given the builders at Woodside the tools they need to iterate as they run into the next real-world problem. The Woodside team stressed the importance of long-term testing and simulation. As the cloud expands to new places, and the digital and physical worlds become more integrated, reliability becomes increasingly important. As our CTO Werner Vogels loves to say, "Failures are a given", which of course means that we need to build with that reality in mind and find novel ways to improve the reliability of our solutions. We recently announced AWS Fault Injection Simulator, a fully managed service for running fault injection experiments on AWS that makes it easy to improve your performance, observability, and resiliency.
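As a hedged sketch of what a first fault injection experiment might look like, the following uses boto3 to define and start an AWS FIS experiment that stops one tagged EC2 instance; the role ARN and tag values are placeholders for your own environment.

```python
# Hedged sketch: define and run a simple AWS Fault Injection Simulator experiment that stops one
# EC2 instance tagged as part of a test environment, to observe how the system recovers.
import boto3

fis = boto3.client("fis", region_name="ap-southeast-2")

template = fis.create_experiment_template(
    clientToken="robot-stack-stop-instance-demo",
    description="Stop one tagged EC2 instance to test recovery behaviour",
    roleArn="arn:aws:iam::123456789012:role/fis-experiment-role",  # placeholder
    stopConditions=[{"source": "none"}],  # in practice, tie this to a CloudWatch alarm
    targets={
        "testInstances": {
            "resourceType": "aws:ec2:instance",
            "resourceTags": {"environment": "staging"},
            "selectionMode": "COUNT(1)",
        }
    },
    actions={
        "stopOneInstance": {
            "actionId": "aws:ec2:stop-instances",
            "targets": {"Instances": "testInstances"},
        }
    },
)

experiment = fis.start_experiment(experimentTemplateId=template["experimentTemplate"]["id"])
print(experiment["experiment"]["state"])
```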

Finally, the Woodside story brings into sharp focus just how useful geospatially tagged data is, be it labelled point clouds, or IoT sensor data with geo-tagging, or data sourced from a robot as it navigates through the environment. The Woodside teams have created a complete digital representation of their facilities - what they call a "digital twin". In a sense, the robots become mobile IoT sensors, acting like an extension of their cloud environment and a source of rich data to be used in operational processes such as monitoring, or to form the basis for other use cases. Data services such as Amazon Kinesis can be used to process streaming data from IoT sensors, cameras and LiDAR. Data labelling and processing can be achieved through Amazon SageMaker Ground Truth and Amazon SageMaker Processing, and integrated using AWS Glue.
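As a simple illustration of the streaming path mentioned above, here is a sketch of putting a geo-tagged sensor reading onto an Amazon Kinesis data stream; the stream name and record fields are assumptions.

```python
# Illustrative sketch: stream a geo-tagged sensor reading into Amazon Kinesis Data Streams.
import json
import boto3

kinesis = boto3.client("kinesis", region_name="ap-southeast-2")

record = {
    "robot_id": "robot-01",
    "lat": -20.99,
    "lon": 116.77,
    "gas_ppm": 3.2,
    "timestamp": "2021-05-19T03:21:00Z",
}

kinesis.put_record(
    StreamName="robot-telemetry",        # placeholder stream
    Data=json.dumps(record).encode("utf-8"),
    PartitionKey=record["robot_id"],     # keeps each robot's readings ordered within a shard
)
```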

If you'd like to get started with robotics, please attend the session later today entitled 'Building autonomous robots for everyday tasks with AWS RoboMaker'. Our third trend is the prevalence of pictures, video and audio. Our customers have always processed a wide range of data types, but we're now reaching a tipping point where pictures, voice, and video will become the dominant data types that we process. The building blocks have now matured to make rich media processing tenable for all builders, and we see the use cases increasing significantly. With the increase in data, we’re also seeing our fourth trend, the emergence of The Internet of Machine Learning. Like the Internet of Things, where the Internet became connected to all things in our world, machine learning will expand to be able to touch every part of the Internet.

It will be embedded in everything we do, from IoT, to business process optimisation, and even software delivery. This will not only broaden the specific problems we solve, but it will improve how all problems are solved. And AWS is here to support your use cases, with Amazon SageMaker and our suite of machine learning services.

Amazon SageMaker is amongst the fastest-growing services in AWS history, and the pace of innovation in machine learning is only increasing with more than 250 new features launched just this year. Typically, we think about image processing in the context of what we can see, such as autonomous vehicles or security applications. But image processing is being used for a much broader range of use cases. I visited with Presagen, an Australian startup based out of Adelaide, that has built a global data platform using novel machine learning approaches and image recognition to help address a range of important health issues for women.

The Internet of Machine Learning is opening new possibilities for image, video and voice processing. This is having a transformative impact on many industries including healthcare. I'm meeting with the team from Presagen, who are making groundbreaking advancements using image recognition.

So your current venture is Presagen, can you tell me a little bit about that? MICHELLE PERUGINI: The work that we do in women's health is incredibly inspiring, it's about improving women's healthcare outcomes at global scale using technology. Women's health has been somewhat forgotten. Having them benefit from the technology that we've developed and changing what is quite a stressful and traumatic process for them and making that easier is just such an amazing opportunity.

I tried to have children, and it took me around three years to do that. I tried a whole range of different treatments, many of which did not work. My own personal experience with infertility has really driven me to help improve women's healthcare outcomes.

Any way that we can use technology to improve that process for them is an amazing advantage. QUINTON ANDERSON: Can you tell me a little bit about your background? MICHELLE PERUGINI: I've always had a passion for healthcare and the inner workings of the human body. And so, I started off in academia in stem cell biology and genetics and found that really fascinating, but I lost the connection to how that could be utilised to actually improve real world healthcare outcomes, and that's what I'm passionate about. I started to make my transition out of science in 2007, when I had the opportunity to build my first AI tech company. It was a really exciting time for me, because it allowed me to explore this whole new world of technology. QUINTON ANDERSON: Can you tell me how your current venture, Presagen, came about? MICHELLE PERUGINI: I was mentoring Jonathan Hall through a university commercialisation program, and he came to me with this idea about using computer vision to assess quality of embryos.

And I personally connected to this problem, because of my own fertility struggles as well as my background in stem cell biology, and we immediately connected on this idea. JONATHAN HALL: Meeting Michelle really allowed me to feel that we could bring something like this into a reality. And as we talked about the idea and how that could become a real product in a company, we were really excited about it. MICHELLE PERUGINI: I immediately connected to this concept, because my husband Don and I had previously built a global AI technology business. DON PERUGINI: I remember the day she came home, saying, wow, this guy is doing some really cool stuff with computer vision, looking at embryos.

So, I was actually excited about it, because we can actually build a platform that can apply AI to healthcare problems for patients around the world. So, that's kind of where it started. QUINTON ANDERSON: Presagen's products are focused on tackling women's health challenges. Can you tell me why? JONATHAN HALL: There's a very big trend in the industry where female-oriented technologies are not being focused on. It's an area that really needs a lot more focus. MICHELLE PERUGINI: Women's healthcare has really been underserved, particularly by technology, historically.

And FemTech is kind of an emerging field that is trying to solve that problem and trying to improve healthcare outcomes for women. QUINTON ANDERSON: The business model is partially based on a social network for healthcare. Can you tell me about that portion of the business model? MICHELLE PERUGINI: So, what we're trying to do is build a global connected network of clinics so that we can access that medical data in order to create products that they can then use to service their patients.

The clinics become the data contributors to our products, we build those products, we regulate them, and then we funnel them back out through that clinic network out to patients who are the beneficiaries. DON PERUGINI: Some of those clinics will actually be content creators. So, they will actually provide medical data in order to help us create these products and in return receive royalties. What the patients get are better healthcare outcomes, and what that is really doing is fundamentally changing the way medical data is connected from around the world in order to build these highly scalable medical products for all patients around the world. So, it's both accessible and affordable.

QUINTON ANDERSON: There's obviously a number of challenges involved in building a global platform. Can you tell us about some of the challenges you had to solve? DON PERUGINI: There's data privacy laws in countries around the world that state that medical data cannot leave the country of origin. Now, that creates challenges for machine learning because typically you would move that data to a central location in order to train your machine learning models.

So, what we had to do was, using the global cloud infrastructure, which ensures the data stays local, safe and secure, create federated machine learning algorithms that can train on that data distributed all throughout the world without having to move or see that medical data. In terms of data quality, what we found was a lot of medical data was inherently poor quality. So, it's not about more data, it's about using the right data. So, the right data is about making sure you have a globally diverse data set, getting representation from different types of patient demographics in clinical settings, and also making sure you have good quality data, data that doesn't have errors in it. JONATHAN HALL: So, we started to work with these partners and collect this dataset and then also curated that dataset down whilst we understood the kind of data that we were putting in there.
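(To illustrate the federated idea Don describes - training where the data lives and only combining model parameters centrally - here is a generic federated-averaging toy in numpy. It is not Presagen's proprietary algorithm; the logistic-regression model and simulated "clinics" are purely illustrative.)

```python
# Toy federated averaging: each "clinic" trains locally on data that never leaves it,
# and only the resulting model weights are combined centrally.
import numpy as np

def local_update(weights, features, labels, lr=0.1, epochs=5):
    """A few steps of logistic-regression gradient descent on one clinic's local data."""
    w = weights.copy()
    for _ in range(epochs):
        preds = 1.0 / (1.0 + np.exp(-features @ w))
        grad = features.T @ (preds - labels) / len(labels)
        w -= lr * grad
    return w

def federated_average(local_weights, sample_counts):
    """Combine locally trained weights, weighted by each clinic's dataset size."""
    total = sum(sample_counts)
    return sum(w * (n / total) for w, n in zip(local_weights, sample_counts))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    global_w = np.zeros(4)
    # Three simulated "clinics", each with its own private data.
    clinics = [(rng.normal(size=(50, 4)), rng.integers(0, 2, 50)) for _ in range(3)]
    for _round in range(10):
        updates = [local_update(global_w, X, y) for X, y in clinics]
        global_w = federated_average(updates, [len(y) for X, y in clinics])
    print(global_w)
```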

That ended up being a core technology that we patented for data cleansing, which has been one of the key business successes that we've had at Presagen so far. QUINTON ANDERSON: Can you tell me about your ML infrastructure? JONATHAN HALL: When we're doing machine learning, we use Amazon EC2 instances and they're very flexible. We use GPU instances because they're very powerful and have a lot of memory. When we're dealing with privacy concerns and special data needs, we also have models that are quite heavy duty. And so, we found that Amazon EC2 is the best balance of flexibility and power. QUINTON ANDERSON: I understand you were part of the AWS Activate program.

Can you tell me about your experience? DON PERUGINI: So, we had a great experience with the AWS Activate program. The challenge with startups is they're really lacking the funding and the technical expertise to build their product. So, AWS provided both of them to ensure that we can build a world-class product, a global product, that we can deliver around the world. QUINTON ANDERSON: So, Presagen has been in market with Life Whisperer for just over a year now.

Can you tell me about some of the results that you've seen? MICHELLE PERUGINI: We launched our Life Whisperer product in early 2020, and in October we had the news of the first baby being born by virtue of having used our technology. At that point, it really becomes real, because you can see the real world benefit that the technology has had for those patients through that process of IVF. QUINTON ANDERSON: And for those patients, it's truly meaningful.

MICHELLE PERUGINI: It really is. I mean, the first patient that became pregnant had multiple repeat failed cycles prior to that cycle where they used Life Whisperer, and we like to think that Life Whisperer had a part to play in creating their family. JONATHAN HALL: The patients are going through what is usually a very stressful experience.

After that, they might not necessarily have a baby at the end of it, with success rates about 20% or less across the industry. So, even providing the smallest percentage increase in chance to pregnancy is of immense value. And our product produces approximately 20% uplift in terms of chance to pregnancy, which is really valuable to help so many people in this way. QUINTON ANDERSON: So, where do you see Presagen going in the next three years? MICHELLE PERUGINI: Because we built this platform now, we can apply it to a whole range of different use cases and we are interested, initially, in women's health, and we have a whole range of products that we are looking at in the women's health space. But this platform can really be used to create products for any application area within healthcare. JONATHAN HALL: We realised we had this really powerful capability.

We could use data and productionise it very quickly, and are already doing that with our second product. We have three or four more products on our roadmap to be able to use medical data and computer vision to be able to quickly productionise an ML application for a medical problem. DON PERUGINI: Our vision is to have the largest global network of clinics and medical data from around the world that allows patients better healthcare outcomes that are both accessible and affordable. QUINTON ANDERSON: I'm constantly inspired by the breadth of problems our customers solve using machine learning.

Presagen is a great example of novel use of machine learning. Presagen has created a vision to connect a global network of clinics that utilises artificial intelligence to address global healthcare issues. Many critical lessons stand out from this customer story. Firstly, the cost of model training is important. The dynamics of compute costs change significantly as your business becomes machine learning centric. AWS can support your efforts through a range of cost management options.

These range from low-cost Spot Instances through to higher-order services within Amazon SageMaker, to ensure your modelling processes are cost optimised. As Peter DeSantis, Senior Vice President of AWS Infrastructure and Customer Support, told us yesterday, the first tenet of the Amazon EC2 service is: be the price performance leader. Secondly, don't underestimate the value of having a range of compute options at your fingertips. With the cloud, you have the tools and framework abstractions to manage complexity. Therefore, you can avoid the need to compromise, and leverage the right tool for the right job, no matter where your experiments take you.
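As one concrete example of the Spot-based cost optimisation mentioned above, SageMaker managed Spot training can be enabled on an estimator; the image URI, role and S3 paths below are placeholders.

```python
# Hedged sketch of cost-optimised training with SageMaker managed Spot instances.
from sagemaker.estimator import Estimator

estimator = Estimator(
    image_uri="123456789012.dkr.ecr.ap-southeast-2.amazonaws.com/my-training-image:latest",
    role="arn:aws:iam::123456789012:role/SageMakerExecutionRole",
    instance_count=1,
    instance_type="ml.p3.2xlarge",
    use_spot_instances=True,              # run training on Spot capacity
    max_run=3600,                         # cap billable training time (seconds)
    max_wait=7200,                        # how long to wait for Spot capacity; must be >= max_run
    checkpoint_s3_uri="s3://my-bucket/checkpoints/",  # resume if the Spot instance is reclaimed
)

estimator.fit({"training": "s3://my-bucket/training-data/"})
```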

After all, the second tenet of the Amazon EC2 service is: build a flexible platform that supports the breadth and depth of customer use cases. We offer everything from the fastest 400 Gbps networking instances through to the most powerful ML training instances and the lowest-cost inference instances, and we are the only cloud provider with support for Intel, AMD and Arm processors. If you'd like to dive into machine learning, we have some great learning opportunities throughout the day. I would recommend 'Shift your ML model into overdrive with AWS DeepRacer', and 'Enrich your serverless application with AWS artificial intelligence services'. Like Presagen, many of you operate within regulatory frameworks. At AWS, security is our highest priority and our customers benefit from using world-class infrastructure, privacy and security protections.

We also believe it's critical that organisations understand the best and most secure ways to use the cloud and have the right security posture and processes in place to maintain the highest security standards. We expect to see cloud adoption continue to accelerate as organisations of all sizes realise the agility, operational, financial, and innovation advantages of moving to the cloud. For organisations to fully harness the benefits, it's important they remain vigilant about the security of their systems and protect the privacy of the information they store. To that end, I recommend that you undertake a Well-Architected Review across all five pillars - operational excellence, security, reliability, performance efficiency and cost optimisation - and start experimenting with the foundational services that we provide to help you stay safe and secure. Start by hardening your accounts using AWS Organizations, AWS Control Tower, AWS Config and AWS Security Hub. Ensure that you have layered protection through your network, identity and crypto architecture and ensure that you have the ability to detect and respond to incidents.
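A small sketch of one of those first steps - enabling AWS Security Hub with its default standards and listing high-severity findings with boto3 - is shown below; it illustrates a starting point only, not a complete hardening process.

```python
# Sketch: enable AWS Security Hub with default standards, then list high-severity findings.
import boto3

securityhub = boto3.client("securityhub", region_name="ap-southeast-2")

# Enable Security Hub (idempotent per account/region; raises if already enabled).
try:
    securityhub.enable_security_hub(EnableDefaultStandards=True)
except securityhub.exceptions.ResourceConflictException:
    pass  # already enabled

findings = securityhub.get_findings(
    Filters={"SeverityLabel": [{"Value": "HIGH", "Comparison": "EQUALS"}]},
    MaxResults=10,
)
for finding in findings["Findings"]:
    print(finding["Title"])
```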

If you are interested in learning more, please join our session later today, 'Using the Well-Architected Framework to secure your AWS environment'. There's one last technology that I would like to talk about: quantum computing. Although still early days, we will see quantum computing used to solve an increasing number of real-world use cases in the near future. Quantum computing has the ability to deliver unprecedented computational capabilities.

This will help us solve certain types of problems that are difficult or impossible to solve classically. Today, you have access to simulated and real qubits, but this field is evolving rapidly with a number of possible approaches. To help us understand these, we've invited one of the world's leading experts on quantum computing. Professor Michelle Simmons is a Scientia Professor of Quantum Physics in the Faculty of Science at the University of New South Wales.

She has twice been an Australian Research Council Federation Fellow, is an Australian Research Council Laureate Fellow, and her groundbreaking work led to her being named Australian of the Year for 2018. Please welcome Michelle to AWS Summit Online. PROF MICHELLE SIMMONS: As the demand for computing power grows, organisations across the world have begun to try and build a new kind of computer.

This computer seeks to harness the power of quantum physics to perform high-value complex calculations in minutes that might otherwise take thousands of years. We call it a quantum computer, and so far, we've dreamed up at least seven different ways to build one. You can do it using ion traps, where you trap ions in an ultra-high vacuum system using a laser beam. You can do it using superconducting qubits, where the qubit is embodied in the magnetic flux in the loop of a superconducting material. You can potentially do it using photons of light integrated on an optical chip.

There are three more radical approaches: some people are trying to make qubits using defects in diamonds, others use a variation of ion traps based on neutral atoms, and some have even set out to build them on a theoretically proposed particle called a Majorana fermion. I am doing it using atoms in silicon - literally encoding information on individual electrons that reside on atoms in silicon and using voltages, very much like conventional electronics, to control the qubit states. Here's why I think atom qubits in silicon is a great way to go.

To make a quantum computer work, you need speed, stability and the ability to scale. So for speed, given a quantum computer is a statistical machine, you need to be able to input, read out and perform operations at lightning speeds, as you must repeat many calculations before losing the quantum state. Stability - the flipside of this is stability. Quantum states are more fragile than classical states, so if you want to use the processing power of quantum mechanics, you need to be able to hold and precisely control a quantum state for a long enough time to be able to perform calculations on it.

More broadly, devices need to be stable and reproducible. Ultimately, you need to be able to make qubits reproducible knowing exactly what you've been able to make and how you've made it. Ability to scale. You need to have a system that scales. This means you need to be able to manage and control a large number of qubits in very close proximity with the ability to turn them on and off. You also need a system that allows for error correction, which requires complex architectures so you can address multiple qubits at the same time and synchronously.

This requires, ultimately, the ability to realise precision fabrication in all three dimensions. Finally, the overall system has to be implemented on a practical size scale. You can't have a system that takes up the whole of the Nullarbor Plain or occupies half a city. So when I look at all these requirements, I think the most sensible approach to building a quantum computer is to go atomic and to work in silicon.

Speed - atomic-scale qubits are necessarily very small and very close to one another, so they're going to be fast. The mere presence of an atom, with its sharp potential profile, creates the qubit, so you do not need additional electrodes or other materials nearby. This allows us to turn on and off the qubit with high speed and high accuracy.

Stability - due to their small size, atomic qubits will also interact a lot less with the environment, giving them long coherence times. In other words, enabling us to hold onto the quantum state for a long period of time. A quantum computer made in silicon would also allow us to make use of the solid-state material for manufacturing, but one that is crystalline and uniform with minimal interfaces and defects, and therefore, extreme stability.

Ability to scale - silicon is highly manufacturable and allows qubits to be packed very closely with integrated nanoscale control electrodes. Moreover, with atomic qubits, we can pack millions of qubits onto a small chip that still fits inside the operating environment of our computer. Quantum integrated circuits can be realised in 3D, allowing us to do error correction. So these are not just ideas - we've actually been able to implement them, we have proved them in practice. Over the last 20 years, we have pioneered a disruptive, world-leading technology to create and control electronic devices in silicon in which the active components are individual atoms.

The foundation of this process is the scanning tunnelling microscope. This is an imaging tool that won a Nobel Prize for its invention in 1986, and that was the first time anyone had really been able to see atoms. We now use this technology not just to image atoms but to move them, and we've combined this with a high-purity silicon growth technique so that we can position atoms with atomic precision and then we can encapsulate them in a beautiful protective environment to create whole devices with atomic resolution.

This has enabled us to engineer the world's smallest transistor, a classical transistor built of a single atom 10 years ahead of industry predictions. It's also allowed us to create the world's narrowest conducting wires, just a few atoms wide with the same current-carrying capability that copper has. Not only can we position these atoms in silicon with atomic precision, but we can actually see the electron that sits on the atom, and we can measure its wave function directly. This is really like seeing into the qubit. All of this has opened the door to realising the benefits of silicon for a quantum computer. Using our atomic manipulation technologies, we've created all the core components of a quantum computer.

We've shown that we have the ability to read out a single electron on a single atom, telling us the state of the qubit. We've been able to do this on microsecond to nanosecond timeframes with very high accuracy, indeed, at an industry-leading 99.8% fidelity. We've measured the fastest 2-qubit gate in silicon, where we can swap information between two adjacent atoms in just 0.8 nanoseconds. We have shown that by building our atom qubits in crystalline silicon, moving away from surfaces and interfaces with different materials, that they're incredibly stable - an order of magnitude more stable than other semiconducting qubits. And we've also shown that we can manufacture devices with atomic precision engineered in all three dimensions.

This is critical if we're going to realise scalable processes with the ability to both detect and correct for errors during calculations. Having this level of control has enabled us to start building a quantum computer in the solid state in practical terms. We've started this journey by manufacturing and operating quantum processors with one to 10 qubits, and we're incredibly optimistic. You see, there is a historical precedent which suggests reasons for optimism as we scale. It took 11 years to go from the first transistor back in 1947 to an integrated circuit in 1958 - both feats winning a Nobel Prize.

It was only another six years before a commercial calculator was made from this technology in 1964. This was repeated again for the CMOS technology that we use in current-day computer chips: 11 years from the invention of the CMOS transistor to an integrated circuit, and then a few more years to a commercial computing device. Our first single-atom transistor was in 2012 and, based on our work at UNSW and at Silicon Quantum Computing, we expect to have our first quantum integrated circuit in 2023. Within five years of that, we aim to have our first commercial device. We've patented multiple architectures and device components to scale up to large-scale processors with many thousands of qubits, and we've shown how we can use our technology to pattern thousands of qubits to show feasibility.

Our quantum systems are much more sophisticated and delicate than their classical forebears, yet we're tracking to exactly the same timeframe. We now have the capacity to control matter and manufacture devices at the atomic scale. This is an unprecedented capability.

It opens a unique prospect for building and realising a quantum computer in silicon. QUINTON ANDERSON: Thank you, Michelle, that was awesome. PROF MICHELLE SIMMONS: Pleasure. QUINTON ANDERSON: There's a lot of confusion at the moment about what problems quantum can and cannot solve. Can you think of any likely use cases? PROF MICHELLE SIMMONS: Yeah, look, it's amazing for me because the field has just grown exponentially in the last few years. There are now more than 60 different quantum algorithms that people have got out there, and they have lots of different use cases, whether it's quantum simulation, optimisation, machine learning or code breaking. If you had to ask me for an example, I'd say a lot of people are excited about quantum simulation as being the nearest-term application, in particular, looking at a problem called the Haber-Bosch process.

So, the Haber-Bosch process is a process that breaks nitrogen, which is triply bonded, into individual nitrogen atoms, and puts them into the ground as fertiliser. That process is a very high-energy process, it's very expensive to run, and if you can find the right kind of catalyst, then you'll be able to make it much more efficient. And so, quantum computers are predicted to help solve this: how do you run the right quantum simulation to find the right catalyst that can make that process energy efficient? So that's something I think the agricultural industry is very interested in.

QUINTON ANDERSON: And Michelle, do you have any advice for those who are just getting started with quantum computing? PROF MICHELLE SIMMONS: Yeah look, it's a very exciting time to get involved. I think nowadays, you can actually go and try quantum computers online, so you can access them through the cloud - I know you can with the AWS service. And you can actually see how they work, so you can literally program them, figure out what they do, get information back. So it's a great educational tool, to use them. I think for the companies out there, if you really want to solve a hard problem and you want to know what you need to do, you need to get in deep with those companies that are doing it and work directly with them.
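(As a first-steps illustration of programming a quantum computer through the cloud, here is a short sketch using the Amazon Braket SDK's local simulator to build and run a two-qubit Bell circuit before moving on to managed simulators or hardware.)

```python
# First-steps sketch with the Amazon Braket SDK: build a Bell circuit and run it locally.
from braket.circuits import Circuit
from braket.devices import LocalSimulator

# Hadamard on qubit 0, then CNOT, producing an entangled Bell state.
bell = Circuit().h(0).cnot(0, 1)

device = LocalSimulator()
result = device.run(bell, shots=1000).result()

# Expect roughly half "00" and half "11" measurement outcomes.
print(result.measurement_counts)
```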

But for now, those tools are fantastic to learn the system, and you need to learn now, because this wave is coming. QUINTON ANDERSON: Thank you, Michelle. That was very informative. And thanks for joining us at AWS Summit 2021. PROF MICHELLE SIMMONS: It's a pleasure, thank you. We don't yet know which approaches will succeed over time, and we aren't sure yet which algorithms are practical.

We are certain, though, that we're at the start of this wave and it's time for you to start exploring, learning, and diving deep into this aspect of computing. We have a breakout session later today where you can find out more about experimenting with Amazon Braket and quantum computing, so you can start your journey. Today, you've heard from a range of customers and builders who are integrating the digital and physical worlds, processing different types of data in novel ways, embracing machine learning and using the cloud everywhere. I want to finish by saying thank you for attending today.

We know the future depends on your ability to innovate. So I encourage you to wonder, expand your knowledge, master new services and make a meaningful difference. Let's go build! (ENERGETIC MUSIC PLAYS)
