Under the Health Insurance Portability and Accountability Act (HIPAA), cloud service providers that handle protected health information (PHI) are considered business associates, that is, entities that create, receive, maintain, or transmit PHI on a covered entity’s behalf. Companies that perform services such as claims administration, quality assurance, benefits management, and billing also qualify as business associates. Accordingly, Chung encouraged healthcare organizations to push their CSPs to sign a business associate agreement, or BAA, to ensure that the provider assumes responsibility for safeguarding the organization’s PHI. “If a CSP is not willing to sign a BAA, then you have to ask yourself, ‘Do they treasure your data as much as you do?’” Chung said. “The BAA provides assurance to organizations that we protect their data, that we provide training to our employees, and that we store and process consumer data securely.” Healthcare’s traditional network perimeter no longer exists. Many physicians and nurses work at multiple locations for the same institution, sometimes visiting several in one day, and clinical staff may conduct research at a nearby university.
In traditional environments, builds follow a predefined cadence: weekly, fortnightly, or sometimes even monthly, partly because the deployments themselves take time. The problem with this approach is that bug fixes and new features must wait for the next scheduled date, so there is a built-in delay. A second problem is that by the time testers finish testing and report bugs and defects, the programmers have moved on to other pieces of implementation and have less interest in resolving defects in the older code. This approach also delays the time it takes to make a feature available in production. Builds and deployments are repetitive, and sometimes boring, activities. ... Automating tests behind the GUI is considerably easier than automating the GUI itself. Another advantage is that the tests remain valid regardless of UI changes: even if a UI element changes, the functionality of the feature does not. This technique focuses mainly on the business logic and rules.
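The idea of testing behind the GUI can be sketched in a few lines. Everything here is hypothetical (the `apply_discount` rule stands in for whatever business logic a web form would invoke); the point is that the test exercises the rule directly, so changes to buttons, layouts, or element locators cannot break it.

```python
# Hypothetical business rule: coupon "SAVE10" takes 10% off orders
# of 50.00 or more. A web form would normally call this function;
# the checks below bypass the form and exercise the logic directly.

def apply_discount(subtotal, coupon=None):
    if coupon == "SAVE10" and subtotal >= 50.0:
        return round(subtotal * 0.90, 2)
    return subtotal

# Logic-layer checks: no browser, no element locators, no rendering.
assert apply_discount(100.0, "SAVE10") == 90.0   # coupon applies
assert apply_discount(40.0, "SAVE10") == 40.0    # below threshold
assert apply_discount(100.0) == 100.0            # no coupon
```

This style of test survives any redesign of the checkout page, which is exactly the "functionality remains intact" advantage described above.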
The major public clouds and several database vendors have implemented planet-scale distributed databases with underpinnings such as data fabrics, redundant interconnects, and distributed consensus algorithms that enable them to work efficiently and with up to five 9’s of reliability (99.999% uptime). Cloud-specific examples include Google Cloud Spanner (relational), Azure Cosmos DB (multi-model), Amazon DynamoDB (key-value and document), and Amazon Aurora (relational). Vendor examples include CockroachDB (relational), PlanetScale (relational), Fauna (relational/serverless), Neo4j (graph), MongoDB Atlas (document), DataStax Astra (wide-column), and Couchbase Cloud (document). ... Companies with large investments in data centers often want to extend their existing applications and services into the cloud rather than replace them with cloud services. All the major cloud vendors now offer ways to accomplish that, both through specific hybrid services (for example, databases that can span data centers and clouds) and through on-premises servers and edge resources that connect to the public cloud, an arrangement often called a hybrid cloud.
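The reliability claim can be made concrete with some back-of-the-envelope arithmetic. The model below is a deliberate simplification (it assumes replica failures are independent, which real distributed systems cannot fully guarantee), but it shows why redundancy plus consensus is the path to five 9’s.

```python
# Simplified availability model: the service is up if at least one
# of n independently failing replicas is up.
def combined_availability(per_replica, replicas):
    # P(service up) = 1 - P(all replicas down)
    return 1 - (1 - per_replica) ** replicas

# Suppose each replica alone offers "three 9's" (99.9%).
for n in (1, 2, 3):
    print(n, f"{combined_availability(0.999, n):.9f}")
# Under this idealized independence model, two replicas already
# exceed 99.999%; correlated failures make real systems less forgiving.
```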
Early success stories are encouraging, but they are the exception rather than the rule. More often, firms have started the transformation journey but have faltered along the way. Common reasons include a lack of ownership at senior levels and budgetary or strategic constraints that prevent project teams from executing effectively. The challenges of transforming service models are significant but not insurmountable. Indeed, as analytics use cases become more pervasive, implementation at scale becomes more achievable. In the following paragraphs, we present five ingredients of an analytics-based transformation (Exhibit 3). These can be supported by strong leadership, a rigorous focus on outcomes, and a willingness to embrace new ways of working. Indeed, managers who execute effectively will get ahead of the competition and be much more adept at meeting client needs. ... Analytics-driven transformations are often restricted to narrow silos occupied by a few committed experts. As a result, applications fail to pick up enough momentum to make a real difference to performance.
Firstly, data science has never been about reinventing the wheel or building highly complex algorithms. The role of a data scientist is to add value to an organization with data, and in most companies only a very small portion of that involves building ML algorithms. Secondly, there will always be problems that cannot be solved by automated tools. These tools offer a fixed set of algorithms to pick from, and if you do encounter a problem that requires a combination of approaches, you will need to solve it manually. Although this doesn’t happen often, it still does, and as an organization you need to hire people skilled enough to do it. Also, tools like DataRobot can’t do data pre-processing or any of the heavy lifting that comes before model building. As someone who has created data-driven solutions for startups and large companies alike, I can say the situation is very different from working with Kaggle datasets. There is no fixed problem: usually, you have a dataset and are given a business problem.
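The pre-processing point is worth illustrating. The records and field names below are invented, but they mirror what raw business data usually looks like, and this is the heavy lifting that happens before any AutoML tool ever sees the data.

```python
# Invented raw records: mixed types, thousands separators, missing
# values, inconsistent labels; typical of real exports, unlike a
# curated Kaggle dataset.
raw_records = [
    {"age": "34", "income": "52,000", "churned": "Yes"},
    {"age": None, "income": "61000",  "churned": "no"},
    {"age": "29", "income": "",       "churned": "YES"},
]

def clean(record):
    age = int(record["age"]) if record["age"] else None
    income_text = (record["income"] or "").replace(",", "")
    income = float(income_text) if income_text else None
    churned = record["churned"].strip().lower() == "yes"
    return {"age": age, "income": income, "churned": churned}

cleaned = [clean(r) for r in raw_records]

# Impute missing ages with the mean of the known ones.
known_ages = [r["age"] for r in cleaned if r["age"] is not None]
mean_age = sum(known_ages) / len(known_ages)
for r in cleaned:
    if r["age"] is None:
        r["age"] = mean_age
```

Only after steps like these does "model building" begin, which is why the tooling alone cannot replace a skilled practitioner.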
More organizations are starting to fully adopt Infrastructure-as-Code (IaC) to create fully autonomous cloud-based environments. From a security perspective, ensuring that the supply chain from code to production is protected and monitored is a growing concern for organizations. Tools in this space are starting to mature, and new strategies are being implemented. For example, teams can pre-validate configurations and architecture, ensuring that code is compliant and secure before it ever moves to production. ... Multi-cloud strategies are here to stay – many enterprises are picking the technologies best suited to their platforms while also creating resilient architectures that use more than one cloud service provider. This adoption model will soon mature, along with multi-cloud security practices and tools. Additionally, “multi-cloud” is coming to envelop edge computing, which will continue to extend onto factory floors, as well as into branch offices and private data centers.
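Pre-validation of configurations can be sketched as a simple policy check. The resource schema and rules below are invented for illustration; real pipelines typically use policy-as-code engines such as Open Policy Agent, but the principle is the same: fail the build on non-compliant definitions before anything reaches production.

```python
# Hypothetical policy checks over a planned set of IaC resources.
def validate(resources):
    violations = []
    for r in resources:
        if r.get("type") == "storage_bucket" and r.get("public_access"):
            violations.append(f"{r['name']}: public access is forbidden")
        if r.get("type") == "firewall_rule" and "0.0.0.0/0" in r.get("allowed_cidrs", []):
            violations.append(f"{r['name']}: open-to-world ingress is forbidden")
    return violations

# An invented deployment plan with one violation.
plan = [
    {"type": "storage_bucket", "name": "logs", "public_access": True},
    {"type": "firewall_rule", "name": "ssh", "allowed_cidrs": ["10.0.0.0/8"]},
]
issues = validate(plan)
# A CI step would exit non-zero on any issues, blocking the deployment.
```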
A composable enterprise aims to create an application architecture in which enterprises can deliver various functions through composition (as opposed to development) by leveraging packaged business capabilities (PBCs). Gartner estimates that by 2023, 30% of new applications will be delivered, priced, and consumed as libraries of packaged business capabilities, up from fewer than 5% in 2020. To be fair, this run-up to a composable enterprise is not a fresh-off-the-press revelation. Enterprises have been attempting to move from hardcore coding-based development to a more service-and-composition-oriented architecture over the last couple of decades, albeit only in pockets and not as fast as they would have wished. The composable enterprise has become the need of the hour, driven by the sense of urgency created by multi-faceted disruption across industries, coupled with technological advancements that make it possible for organizations to accomplish it at enterprise scale.
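A toy sketch of what composition over development looks like in code; the capability names here are invented. Each PBC hides its implementation behind a small, stable interface, and a new application is assembled by wiring existing capabilities together rather than rewriting their logic.

```python
# Two stand-in packaged business capabilities (PBCs). In practice these
# would be independently deployed services; here they are plain callables.
def check_inventory(sku):
    return sku in {"A100", "B200"}            # pretend catalog lookup

def take_payment(amount):
    return f"paid:{amount:.2f}"               # pretend payment service

def compose_checkout(inventory, payment):
    # The "application" is pure orchestration of existing capabilities.
    def checkout(sku, amount):
        if not inventory(sku):
            return "out-of-stock"
        return payment(amount)
    return checkout

# Assemble a checkout flow by composition, not by coding the logic anew.
checkout = compose_checkout(check_inventory, take_payment)
```

Swapping in a different payment capability requires no change to the checkout flow itself, which is the core promise of the composable approach.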
The reality of the digital shift is that consumers are no longer constrained by how far away something is: the item they want to buy, the service provider they want to engage, the employer they want to work for, the trainer they want to buff them up, or the concert or movie they want to watch. Or just about anything else they want to do. Technology is making once-physical interactions immersive digital experiences – sometimes complementing the physical world, and sometimes replacing the activities once done there. For businesses, this is both a threat and an opportunity – an undeniable dynamic driving the evolution of the connected economy. In retail. In grocery. In entertainment. In work. In banking. In just about everything — including many healthcare services. Proximity is no longer a barrier, and those who wish to make it a competitive advantage now have to one-up the digital alternatives that consumers find easier, more convenient and less wasteful of their precious time.
The CTO role also entails effective management of risks, which keep changing as the organisation innovates. Identifying possible risks and planning how to mitigate them as early as possible is particularly important for any digital transformation initiative, such as cloud migration. ... “However, it is inevitable that as part of that migration there will be some component misconfigured, a vulnerability uncovered in a new technology, or a human error that introduces an unintended path to access a system. The CTO should understand, as a starting point, the possible impacts a breach in a specific application could have on the business, and then assess a difficult question: how likely is that risk to be realised?” “As CTO, you must consider all the surrounding processes and infrastructure needed to mitigate the security risks of an initiative. Are the assumptions you are making about the capabilities of third-party vendors, and of your own security organisation, accurate today and in the future? Perhaps the ROI won’t be quite as high if this is fleshed out in detail upfront, but that will be a far better result than being caught flat-footed after a production roll-out.”
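One common way to put numbers on the "how likely is that risk to be realised?" question is an annualized expected-loss estimate. The figures below are invented for illustration, not taken from the article; the risk names echo the three failure modes mentioned in the quote.

```python
# Illustrative risk register: (estimated annual likelihood, impact in $).
risks = {
    "misconfigured component":      (0.30, 200_000),
    "new-technology vulnerability": (0.10, 500_000),
    "human error exposing access":  (0.20, 150_000),
}

# Annualized expected loss = likelihood x impact; ranking mitigations by
# this figure, rather than by raw impact alone, answers the CTO's question.
expected_losses = {name: p * impact for name, (p, impact) in risks.items()}
for name, loss in sorted(expected_losses.items(), key=lambda kv: -kv[1]):
    print(f"{name}: ${loss:,.0f}/year")
```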
Algorithm-powered recommendation services offer relevant suggestions for users based on their history of choices and are popularly used by video streaming services, e-commerce companies and dating apps. The CAC's regulation, however, is not confined to just search results or personalized recommendation algorithms that push e-commerce products. It also applies to dispatching and decision-making algorithms that are used by transport and delivery services and to generative or synthetic-type algorithms used in gaming and virtual environments, says Ed Sander, China tech and policy expert and co-founder of the technology and business website ChinaTalk, in a blog post. Companies that use algorithm-based recommendations are required to disclose service rules for algorithmic recommendations and periodically review, assess and verify algorithm mechanisms, according to the regulation. The new regulation also says that companies must ensure that their algorithmic models do not induce users to "become addicted, spend large amounts or indulge in activities that go against good public customs."
Quote for the day:
"A leader should demonstrate his thoughts and opinions through his actions, not through his words." -- Jack Weatherford