Mission-critical databases: strategies and best practices


Thank you for joining me for this session, "Mission-Critical Oracle Databases Made Simple." My name is Ashish Ray. Within the Oracle product development group, I head product strategy and product management for a number of the Oracle Database's core mission-critical capabilities, including well-known and enterprise-tested features such as RAC, Active Data Guard, and Maximum Availability Architecture. I also head the product strategy and product management responsibilities for mission-critical products and cloud services such as Exadata and Recovery Appliance. As part of my role, and my team's role, we are often engaged with developers around the world so that they can build mission-critical, highly available, highly scalable applications leveraging the latest capabilities of the Oracle Database.

For the purpose of this session, let's pick two developers, Srini and Sophie, and look into how we can help them build mission-critical applications with the utmost simplicity, leveraging the Oracle Database. Before we do that, let's take a step back: what do Srini and Sophie want? They are building an app. What kind of app? Well, it's not just any app. The app will likely process lots of complex data, from lots of different customers and lots of different sources, so there is a lot of processing across complex data sets, and at the same time it has to do this processing very quickly. Okay, understood. What else? It will have to be accessed by lots of customers and end users, probably at the same time, so we are looking at scale. It obviously cannot go down, so there comes availability. It needs to be extremely secure; maybe this is an application in the healthcare space, maybe it is a fintech application, maybe it is an AI application, so along with availability and scalability, security is of utmost importance. And of course it should be able to deploy anywhere: maybe on
premises, maybe cloud, maybe multi-cloud, maybe hybrid cloud, so there should be a lot of deployment flexibility. These are the requirements that any mission-critical application developer may impose on a database.

So what can we do with regards to the Oracle Database? That is where the database requirements for mission-critical applications come in. Let's go through the list that Sophie and Srini presented: number one, complex data models; number two, very fast processing; it has to be accessed by lots of customers; it cannot go down; it must be extremely secure; and of course they should be able to deploy the app anywhere, in the cloud, in a hybrid cloud, or on premises. From a core database perspective, what does this translate to? The underlying data management layer has to support what we refer to as integrated data and workload management in a unified manner. It has to provide extreme performance. It has to provide completely elastic scalability, which means you can add resources in a completely load-balanced way with no downtime. It has to support a never-go-down architecture, which is where high availability comes in. Maximum security is of utmost importance, as we discussed. And the underlying database, without any application changes, should be able to support cloud and hybrid cloud deployment models.

Having set the stage with these requirements, we now have a framework. Whenever we are talking in the context of a mission-critical Oracle database, or any database for that matter, this is the framework against which we can evaluate the database attributes, and then we can see to what extent the Oracle Database is able to address these requirements. At this stage I will make a statement: the Oracle Database, with its unified set of capabilities, makes developing and deploying such
demanding mission-critical applications extremely and uniquely simple. So what is unique about this? Number one, the capability suite: the suite of functionality we provide for mission-critical application requirements is comprehensive, a complete suite of resiliency functionality. Number two, integration: these capabilities are all part of the database ecosystem, which means applications automatically benefit with no code changes. Number three, platform: this database suite can be deployed on infrastructure that we at Oracle have engineered to be completely optimized for database workloads, so the underlying infrastructure and the software work together such that the whole is greater than the sum of the parts. Number four, cloud: this data management model can be deployed on public cloud, hybrid cloud, or multi-cloud without any application changes, giving you complete and seamless application portability. And finally, show me the money, right? It's proven. This infrastructure layer already runs demanding mission-critical workloads globally, so Sophie and Srini will not be all by themselves when they embark on this journey; many global customers have already deployed on this layer.

So let's dig deeper into what we do for each element of this framework. Number one, handling complex data sets: that is where Oracle's converged database comes into the picture. There are various aspects of the converged database. The first is data models. The same unified Oracle database can support a wide variety of data models: relational, obviously, as well as graph, blockchain, text, XML, JSON, geospatial, and many others. The whole idea is that because we support all of these data models in a completely unified and integrated
manner, any applications that Sophie and Srini may be building, transactionally and from a query standpoint, can span any of these data models without going through complex API boundaries, which means much more simplified and seamless application development from a mission-critical requirements standpoint.

The second aspect of handling sophisticated data sets is workloads: again, the same theme, completely integrated data management for all modern transactional and analytical workloads, such as OLTP (online transaction processing), IoT, temporal, geo-distributed, sharded, and a whole range of analytics workloads. The idea is the same: application modules can be developed such that they don't have to go through different data stores and cross different data boundaries, thereby preventing data and processing fragmentation.

I talked about data models and workloads; in the context of the converged database, the development framework is also important. The same underlying database can support a whole range of development frameworks and styles, such as microservices, events, API-driven development, Kubernetes, data mesh, as well as traditional client-tier application development. Having all these development models integrated with the database means that not only do we provide unique architectural support for these modern data models, but having them in the same framework completely simplifies complex application development. So we have walked through the first phase of the framework: a mission-critical database should be able to handle complex, sophisticated data types.
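The converged model described above can be made concrete with a few lines of SQL. This is only a sketch with hypothetical table and column names: one table mixes relational columns with a JSON document, and a single query spans both models with no API boundary in between.

```sql
-- One table holds both relational columns and a JSON document column
-- (hypothetical schema; the IS JSON check constraint enables SQL/JSON access).
CREATE TABLE patients
( patient_id NUMBER PRIMARY KEY,
  name       VARCHAR2(100),
  vitals     VARCHAR2(4000) CHECK (vitals IS JSON)
);

-- One query spans the relational and document models together.
SELECT p.name,
       JSON_VALUE(p.vitals, '$.blood_pressure.systolic') AS systolic
FROM   patients p
WHERE  p.patient_id = 1001;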
Performance is the second requirement Srini and Sophie had: extremely fast processing, and that is delivered by Exadata and Database In-Memory. Let's talk about Exadata first. Exadata is our industry-proven data management platform: available, scalable, reliable, highly performant, and installed at numerous sites around the world, handling highly mission-critical data sets across various industry verticals. The vision of the Exadata platform is to come up with ideal Oracle Database hardware, that is, hardware optimized for the Oracle Database, with scale-out, database-optimized compute, networking, and storage all integrated in one single platform; to power it up with database-aware system software, so that unique algorithms deep in the software stack can vastly improve the handling of your data models and workloads; to provide end-to-end automated management; and on top of that, to give customers the option to consume it on premises, with Cloud@Customer, or in the public cloud.

If you dig deeper, Exadata in itself is a cloud-scale platform comprised of scale-out two-socket database servers completely optimized for the Oracle Database, and scale-out intelligent two-socket storage servers, which is where a lot of the Exadata intelligence runs, all connected at the back end with an ultra-fast, low-latency, high-bandwidth 100-gigabit RDMA over Converged Ethernet (RoCE) internal fabric. But that is just the hardware; where is the secret sauce? The fundamental reason Exadata as a platform can provide tremendous benefits with regards to availability, reliability, and performance in a completely application-agnostic manner is what we have done with the Exadata system software. With the system software, we give Oracle Database workloads the ability to move database processing down to the storage level, and we can do that in a massively
parallel way, taking advantage of storage co-processing so that we can offload that processing at the storage layer, thereby freeing the database server layer to serve the additional workloads that come in when the business takes off and needs to scale. The net-net is that, with this as the fundamental value proposition, along with everything else we have done with Exadata to handle complex workloads, it can be utilized as the most powerful data management platform for mission-critical-scale workloads today.

So look at the workloads an Exadata platform can support today. The fastest OLTP, aided by components and technologies such as scale-out storage, RDMA, memory accelerators, and NVMe flash, all integrated with each other at both the software and the hardware layer using deep algorithmic techniques. For analytics, massively parallel processing, which is where Smart Scan comes in, along with Smart Flash Cache, storage indexes, and unique columnarization, all working together to provide the fastest possible support for analytics workloads. And many of our customers run these workloads together in a highly consolidated manner, using Exadata's native resource management to prioritize Exadata resources across the workloads based on user-selected priorities, without bottlenecks, and while ensuring workload isolation. The net-net is that with Exadata as the underlying platform, when Sophie and Srini are building their mission-critical applications they don't need to worry about questions like: do my applications need to cater to OLTP, do they need to cater to analytics, would there be any resource conflict or resource contention?
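As a small illustration, the storage offload described above is observable from ordinary SQL. This is a sketch: the cell statistics below are populated only on Exadata platforms, and simply report zero elsewhere.

```sql
-- How much work did the storage servers do on the database's behalf?
-- (Sketch: 'cell ...' statistics are meaningful only on Exadata.)
SELECT name, value
FROM   v$sysstat
WHERE  name IN ('cell physical IO interconnect bytes returned by smart scan',
                'cell physical IO bytes saved by storage index');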
All of this is automatically taken care of by Exadata. So that takes care of the core Exadata platform; we talked about the hardware and we talked about the software. Within the core Oracle Database itself, we have performance features such as Oracle Database In-Memory. Database In-Memory is a truly innovative feature that integrates OLTP and analytics together without any application changes: we use the traditional row format for OLTP access and an in-memory columnar format for analytics access, and both of these coexist. The database automatically figures out the workload type, thereby delivering the best of both worlds without changing the applications. So whether it's OLTP or an extremely fast data warehouse, it does not matter; you can run it using Database In-Memory in a highly performant way. We have also integrated various technologies along with In-Memory; for example, we exploit the hardware technology called SIMD, applying vectorized processing to database operations such as fast scans, joins, aggregations, and Bloom filters. In combination, this in-memory technology can drive execution of both OLTP and analytics together at blazing speed.

Then you may ask: this is database technology, so how does it get better with Exadata? Exadata and Database In-Memory are completely integrated, such that the same columnar technology I described in the database servers, carved out of database server DRAM and accelerated by SIMD processing, can be extended to the Exadata storage servers using the storage-server-resident flash. Two things happen as a result. First, the extension of the in-memory layer from database server DRAM to storage server flash is completely transparent, so the in-memory capacity
increases. Second, there is a massive increase in performance as well, and it is completely transparent, because the same in-memory columnarization and processing extend seamlessly across database servers and storage servers. We even have facilities such that if you do not want to use database-server-resident DRAM for in-memory processing at all, fine: you can use just the storage server flash for in-memory processing, and still get the benefits of in-memory processing in an Exadata context.

So, we are going through a journey. We talked about this mission-critical application handling complex data types, and we talked about extremely fast processing with Exadata and in-memory technology. The third requirement is that it needs to be accessed by lots of customers, and that is where elastic scalability with RAC and sharding comes up. RAC is a fundamental Oracle Database scalability and high availability technology, and it has been around for many years; for our customers deploying highly available, highly scalable applications, you can assume RAC is the de facto technology of choice. Essentially, RAC, or Real Application Clusters, makes it extremely simple to transparently scale applications across a pool of database servers, all of them accessing a unified set of storage. That is the fundamental value proposition of RAC: as the workload increases, you simply add database servers, or even storage servers, and you scale in a completely load-balanced way. As more and more traffic comes in, as the application developed by Srini and Sophie gets popular, not an issue: you can add database processing and storage processing in a completely elastic way. And of course, because it is a clustered model, it also helps with outages: if one of the servers goes down, the remaining servers take over the processing in a completely active-active manner. So that's RAC. And then we have gone beyond RAC.
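To close the loop on Database In-Memory from a moment ago: enabling it is declarative and needs no application changes. A minimal sketch, assuming a hypothetical SALES table and an instance with a nonzero INMEMORY_SIZE already configured:

```sql
-- Populate SALES into the in-memory column store (hypothetical table;
-- assumes INMEMORY_SIZE has been set for the instance).
ALTER TABLE sales INMEMORY MEMCOMPRESS FOR QUERY LOW PRIORITY HIGH;

-- Verify population status of in-memory segments.
SELECT segment_name, populate_status, bytes_not_populated
FROM   v$im_segments;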
Geo-distribution is where Database Sharding comes in. Sharding looks at scalability and availability in a distributed database context: you can take certain tables and say that certain subsets of a table's rows reside on this particular database and other subsets on that particular database, horizontally distributing the data. Often this is guided by principles such as data sovereignty. Essentially, you lay out a shared-nothing architecture whereby database one holds certain sets of data, database two holds certain sets, database three holds certain sets. At this moment you may say: wait a minute, so if I am Sophie or Srini, my application is accessing three different databases? No: your application still accesses one logical database, and the underlying data distribution, request routing, and completely parallel horizontal processing are orchestrated by this technology, Oracle Database Sharding. The net-net is massively parallel processing, completely horizontal scale-out, and an ideal footprint for data sovereignty requirements.

So where are we in this journey? To recap: a converged data model with workload integration, extremely fast processing, and complete scale-out. The fourth requirement is a never-down architecture, and that is where our Maximum Availability Architecture (MAA) comes in. In essence, MAA is our Oracle-recommended blueprint for how all of our high availability technologies interplay, offering a highly available platform and architecture that provides protection from unplanned outages and even from planned maintenance activities. It uses technologies such as Active Data Guard and GoldenGate for replication of the data, thereby keeping multiple physical and logical copies. It uses technologies such as Application Continuity and online redefinition for continuous availability.
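Returning to sharding for a moment, the horizontal layout described above is declared in ordinary DDL. A sketch with hypothetical names; a real deployment also involves creating the shard catalog and shard directors with GDSCTL:

```sql
-- Rows are distributed across shards by consistent hash on the sharding key.
-- (Hypothetical table; assumes a tablespace set named TSP_SET_1 exists.)
CREATE SHARDED TABLE customers
( cust_id    NUMBER       NOT NULL,
  region     VARCHAR2(20),
  cust_name  VARCHAR2(60),
  CONSTRAINT pk_customers PRIMARY KEY (cust_id)
)
PARTITION BY CONSISTENT HASH (cust_id)
PARTITIONS AUTO
TABLESPACE SET tsp_set_1;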
For data protection, it uses technologies such as ZDLRA, Flashback, and RMAN. And I already talked about scale-out with technologies such as RAC and sharding. What we have done with Maximum Availability Architecture is take all of our decades of customer deployment experience and embed that knowledge into recommended configurations, with blueprints based on your application requirements, so you can roll out this architecture in a phased manner. Internally we call them the bronze, silver, gold, and platinum reference architectures, so that based on your high availability SLAs you can proceed in a step-by-step manner.

I mentioned Active Data Guard, which is our underpinning for data protection and disaster recovery. It keeps synchronized copies of the production database, so that with any outage you can simply fail over, manually or even automatically, without any application changes; we have hooks such that applications are automatically redirected to the new production site without any data loss. At this moment you may say: wait, if I'm keeping all these production database copies around, isn't that a lot of waste? Not really, because with Active Data Guard those copies can also serve read access while protecting your data. So if Srini and Sophie are building, say, a healthcare app, the transactional modules can run against the production database, but when a patient looks up their history or upcoming appointments, those are read modules, and all of them can be directed to the synchronized copies, thereby load-balancing and increasing the ROI of your data protection investment. So that is Active Data Guard. And we have architected several other technologies, one of them is Transparent
Application Continuity. Picture a very busy, fast-processing healthcare application accessing the database, with a lot of components in play, and boom, something happens: some error, a network outage, maybe a hardware component failure. Typically the application gets an error back, and the developer has to write complex error-handling code. With Application Continuity, applications don't need that complex error-handling code: the connection can be redirected to a surviving node, for example another Real Application Clusters instance, and we replay the SQL state that was in flight in the previous transaction, preserving transactional semantics. As long as applications use our recommended drivers and connection pools, the failover and the replay of in-flight work are completely transparent as far as the application is concerned.

So I talked about availability, but I was focusing on unplanned outages, that is, failures. What about planned maintenance events? For example, Sophie and Srini's data model has grown more complex: a table that used to have only four columns now needs seven, or some other physical structure of the table, or something else about the objects, needs to change. Sure enough, all of these changes can happen while applications are accessing the table, in a completely online manner: we track the changes, and once everything is ready, we swap the original table with the resulting table, and voila, applications see the new table with the new set of columns, the new set of attributes. We have reorganized the table online, without any application downtime. At this moment you may say: fine, you have evolved the underlying objects, but what about evolving the application itself? Yes, we have that covered too.
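The online table reorganization just described is driven by the DBMS_REDEFINITION package. A heavily condensed sketch with hypothetical names, ORDERS being the live table and ORDERS_INTERIM the new shape; a full run also validates redefinability and copies dependents:

```sql
-- Online redefinition, condensed (hypothetical schema and table names).
BEGIN
  DBMS_REDEFINITION.START_REDEF_TABLE(
    uname => 'APP', orig_table => 'ORDERS', int_table => 'ORDERS_INTERIM');
  -- Applications keep reading and writing ORDERS the whole time;
  -- ongoing changes are tracked and applied to the interim table.
  DBMS_REDEFINITION.SYNC_INTERIM_TABLE(
    uname => 'APP', orig_table => 'ORDERS', int_table => 'ORDERS_INTERIM');
  -- The final swap needs only a very brief window.
  DBMS_REDEFINITION.FINISH_REDEF_TABLE(
    uname => 'APP', orig_table => 'ORDERS', int_table => 'ORDERS_INTERIM');
END;
/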
Edition-based redefinition is exactly that ability. The old version of the application, say V1, looks at what we call an editioning view; then we upgrade the application to an edition V2, and V2 accesses its own isolated view on the same database. So maybe you run V1 in production mode while running V2 in a test-and-dev mode, and when you think you are ready, when the application errors have been fixed and it is well tested for performance, scalability, and availability, sure, you switch over to V2 and that becomes the new application. All of this happens with the underlying data management layer supporting a completely online experience, without any downtime. These are the various facets of Maximum Availability Architecture.

The fifth requirement was extreme security, and that is where Maximum Security Architecture comes in, along with another platform that we have built, called Recovery Appliance. Let me talk about both of them. Maximum Security Architecture is premised similarly to how Maximum Availability Architecture evolved. It focuses on using technologies intrinsic to the Oracle Database to provide protection at two layers: number one, it protects the data that is already in the Oracle Database; number two, it protects against malicious access to that data. And all of the technologies I list here work in tandem, integrated with each other, across both of these principles, protecting the data and protecting the access. For example, with Transparent Data Encryption, Sophie and Srini now have a database-native encryption technology, so if it is sensitive healthcare data, it can be completely encrypted. And you may ask: what about the encryption keys?
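Before getting to keys, it is worth noting that enabling Transparent Data Encryption is itself declarative; applications are unaware of it. A sketch with hypothetical names, assuming a TDE keystore is already configured and Oracle Managed Files names the datafile:

```sql
-- Everything stored in this tablespace is encrypted at rest;
-- application SQL is unchanged (hypothetical tablespace name).
CREATE TABLESPACE secure_data
  DATAFILE SIZE 100M
  ENCRYPTION USING 'AES256' DEFAULT STORAGE (ENCRYPT);

-- Column-level alternative for an existing (hypothetical) table:
ALTER TABLE patients MODIFY (ssn ENCRYPT USING 'AES256');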
Keys are handled by an ancillary technology called Key Vault, which provides enterprise-level encryption key storage and lifecycle management. Then there are technologies such as data redaction and data masking: suppose you want to create test-and-dev databases before rolling out your production application; using redaction and masking you can obscure sensitive data, maybe healthcare data, Social Security numbers, credit card numbers, so that test and development environments never see it. Using technologies such as Database Firewall and Database Vault, you can provide very granular protection against malicious access to the database, including deep monitoring of database traffic at the SQL level, and while this is happening you can centrally audit all of this access. Even inside the database you can provide further layers of security using technologies such as Label Security and Virtual Private Database (VPD). And finally, a recent addition: blockchain tables. You can make a table immutable, usable only in an append-only manner, with database-intrinsic assurances that data already in the table has not been changed or tampered with in any way, using hash-based signatures in the row-level metadata. So that is the net-net of Maximum Security Architecture: a comprehensive suite of security technologies, all integrated with each other at the database level, such that applications using the database automatically benefit.

I mentioned another technology, Recovery Appliance. Similar to Exadata, it is a platform, one that provides granular data protection at the database block level. If a disaster impacts your production databases, the backups, current up to the latest block, are available on the Recovery Appliance, fully processed and secured, and they can be restored at any time.
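The blockchain table mentioned above is created with a single DDL statement. A sketch with a hypothetical table; the retention clauses are mandatory and shown here with sample values:

```sql
-- Rows can be inserted but never updated; deletes and drops are
-- restricted by the declared retention rules (hypothetical table).
CREATE BLOCKCHAIN TABLE audit_trail
( event_time TIMESTAMP,
  actor      VARCHAR2(100),
  action     VARCHAR2(400)
)
NO DROP UNTIL 31 DAYS IDLE
NO DELETE UNTIL 16 DAYS AFTER INSERT
HASHING USING "SHA2_512" VERSION "v1";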
Providing integrated replication from the Recovery Appliance to what we refer to as a cyber vault, we are seeing customers use this technology, along with Maximum Security Architecture, for cyber-security threat protection and ransomware protection; this is a very popular use of the technology. In the area of combating ransomware, the Recovery Appliance holds the latest copy of your data, up to the latest database block, so even if something happens to your production database, you can restore the data set from the Recovery Appliance and get on with your business. While this is an engineered system deployed on premises, we have also recently released what we call the Zero Data Loss Autonomous Recovery Service in our cloud, so that cloud database deployments can automatically benefit from the same data protection architecture.

So that rounds out security. I started with complex data models, then talked about performance, availability, scalability, and security. Finally, Srini and Sophie are almost convinced, but not quite, because now they want the flexibility to deploy this application anywhere: in the cloud, on premises, or in a mix, that is, multi-cloud or hybrid cloud. That is where Oracle Database cloud services come in, and we have made phenomenal progress here. If you look at Oracle Cloud Infrastructure's global locations, 40-plus regions are deployed and fully functional by now, with lots of database, application, storage, and networking workloads running across them. We also have 12 Azure Interconnect regions, where we connect in an integrated manner with Microsoft Azure, and I will talk about that in a few minutes. This is a very attractive value proposition as far as Oracle Database cloud services are concerned. Why? Because it does not matter where Sophie
and Srini are developing and deploying their application: as long as they are targeting the Oracle Database, the same application can be deployed on premises or in our cloud, without any application or data model changes, and in any of these models, either public cloud or the hybrid model we refer to as Cloud@Customer, where we essentially bring the cloud services to you. There are various flavors of this database cloud: the Base Database Service; the Exadata Database Service, where the underlying platform is Exadata; and a completely fully managed service using some of the latest AI and ML capabilities, which is Autonomous Database. Fundamentally, all of these are still the Oracle Database. We believe that Autonomous Database, being completely self-managed using AI and ML models, takes away all of your management headaches, and in that way it offers the best TCO for any mission-critical application. Specifically, with Autonomous Database the underlying platform continues to be Exadata, which provides complete infrastructure automation; we have added a lot of automation to the core Oracle Database specific to Autonomous Database, so complete database automation; and on top of that, we have completely automated the data center cloud operations that monitor and manage all of these database deployments. Together, these layers combine to deliver the benefits of the self-driving, self-managing Oracle Autonomous Database. And when you look at the functional pillars, automatic provisioning, automatic configuration, automatic encryption, automatic online patching and updating, elastic scaling, automatic tuning, all of this is taken care of by Autonomous Database, which really means that as an application developer you are not worrying about infrastructure maintenance, because
that's the last thing you want to worry about; you want to focus on your application and its complex business logic, and Autonomous Database enables you to do that.

So I talked about all our cloud services, and I mentioned Cloud@Customer. What happens if you are an Azure developer? You like the Oracle Database, everything we offer at the database layer in security, availability, performance, and reliability, but you are already used to the Azure ecosystem. Sure: I presented our OCI region map earlier and pointed out that there are 12 regions in the world where we have integrated, in a transparent and seamless manner, an Azure region with an OCI region. Net-net, that means you can continue to build your applications and tools in the Azure cloud ecosystem, while at the back end your databases are in OCI, leveraging the Exadata Database Service, the Base Database Service, or even Autonomous Database, thereby getting the best of both worlds without worrying about how to integrate these multiple clouds together. You get the best of the Azure cloud ecosystem from a development framework standpoint, and the best of the Oracle Database from a mission-critical applications standpoint. We see a lot of our customers very excited about this, because it enables them to embark on a truly win-win multi-cloud strategy.

So let's summarize where we are. On the left-hand side is what Sophie and Srini started with: complex data models; extremely fast processing; the ability to be accessed by lots and lots of customers around the world; it cannot go down; it has to be extremely secure; and it should be deployable anywhere. On the right-hand side are the Oracle data management ecosystem components that address all of this: converged data management; Exadata and Database In-Memory for performance; RAC and sharding for scalability; Maximum Availability Architecture for never-down applications;
Maximum Security Architecture and ZDLRA for extreme security and ransomware protection use cases; and finally Oracle Database cloud services and our multi-cloud offering for deployment flexibility. With the combination of all of this, I think we have been able to convince Sophie and Srini that the Oracle Database, with its unified set of capabilities, indeed makes developing and deploying demanding mission-critical applications uniquely simple. If you are interested in learning more, do visit oracle.com/database, and if you want to try many of these capabilities, you can try them hands-on for free; I am presenting various URLs and QR codes that you can scan for hands-on experience with a lot of these technologies. Thank you for your time and interest in how, from an Oracle Database perspective, we can make building and deploying mission-critical applications extremely simple. Thank you.

2023-04-19
