Daily Tech Digest - December 20, 2019

How 8 short months changed the cloud landscape


Even the staunchest believers in a strictly public cloud model are recognizing that the majority of organizations are simply not able to go all-in on public cloud. Between legacy systems, security/compliance requirements, and the need for flexibility, end-user organizations want solutions that allow them to move seamlessly between environments when necessary. Hybrid and multicloud are now becoming standard. This customer pressure is the number one factor driving the rise of mix-and-match cloud offerings. Beyond that pressure, mix-and-match cloud has also progressed because customer momentum around multicloud has grown. Initially it was regulated companies, mainly financial services, that embraced multicloud, but demand has now moved beyond these early adopters to the enterprise mainstream. Once IT decision makers see some success with public clouds and start mapping the future, they often learn the reality of needing hybrid as part of the plan, and that it is smart to have multicloud in the plan as well.



Your Security Strategy Should Scale and Evolve Alongside Your Business


While all workers in a small business may know one another, large businesses often have greater numbers of employees, contractors and partners working in scattered locations and sometimes speaking different languages, which can lead to information security challenges. Likewise, interpersonal dynamics (i.e., office politics) are far more likely to affect human-factors issues in larger businesses than in mom-and-pop operations. Also, because no one person can be an expert on all of the organization’s systems and technologies, proper communication becomes essential to maintaining information security, and communication failures or breakdowns can lead to dangerous mistakes or vulnerabilities. It’s no secret that people occasionally miscommunicate, so these risks, which small businesses need to consider far less often, must be mitigated in large businesses. As such, implementing formal policies and procedures around communication can significantly reduce human-complexity issues by spelling out what proper communication looks like and limiting the potential for riskier techniques.


Machine learning ops to lead AI in 2020


Commensurate with the growth of third-party model usage is the shift to consumption-centric approaches to ML model usage. The market is already shifting in the way that organizations approach model development and usage. While the primary emphasis up to this point has been on model development and creation for use by a single organization, the shift to consumption-centric models will require tooling and environments that address the specific needs of model users rather than model developers. In 2020, we'll see the growth of machine learning ops infrastructure that provides a range of functionality and capabilities for those looking to consume models. Machine learning ops systems are meant to simplify the usage and consumption of AI and machine learning models that were built in-house or by third-party vendors. These platforms will offer features including model governance for controlling, limiting or prioritizing access to models for different users, allowing for collaboration in model usage among various team members.
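
To make the model-governance idea concrete, here is a minimal Python sketch of a consumption-side registry that controls, limits and prioritizes access to models per role. Everything here (the ModelRegistry class, the role names, the policy fields) is a hypothetical illustration, not any vendor's actual API.

# Minimal sketch of consumption-side model governance: a registry that
# controls, limits, and prioritizes access to models per user role.
# All names (ModelRegistry, roles, policies) are hypothetical.
from dataclasses import dataclass, field

@dataclass
class AccessPolicy:
    allowed_roles: set        # roles permitted to invoke the model
    rate_limit_per_hour: int  # cap on calls, enforced per user
    priority: int             # lower number = served first under load

@dataclass
class ModelRegistry:
    policies: dict = field(default_factory=dict)  # model_name -> AccessPolicy
    usage: dict = field(default_factory=dict)     # (user, model) -> call count

    def register(self, model_name: str, policy: AccessPolicy) -> None:
        self.policies[model_name] = policy

    def authorize(self, user: str, role: str, model_name: str) -> bool:
        policy = self.policies.get(model_name)
        if policy is None or role not in policy.allowed_roles:
            return False  # unknown model or role not granted access
        key = (user, model_name)
        self.usage[key] = self.usage.get(key, 0) + 1
        return self.usage[key] <= policy.rate_limit_per_hour

registry = ModelRegistry()
registry.register("churn-predictor", AccessPolicy({"analyst", "app"}, 1000, priority=1))
print(registry.authorize("alice", "analyst", "churn-predictor"))  # True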


Google Cloud’s 5 Boldest Moves of 2019

Over the summer it added a new security feature that puts context-aware capabilities in Cloud Identity-Aware Proxy, which allows companies to define and enforce granular access policies for apps and infrastructure based on a user’s identity and the “context” of their request, such as the user’s location, the time of day they are trying to access a particular app, or the security status of the device. This is good for both the enterprise and its employees: it improves a company’s security posture and means employees have easier access to the cloud and to workloads running in the cloud on any device, without using a virtual private network (VPN) client. A couple of months later Google added a security analytics product that identifies misconfigurations and compliance violations in companies’ GCP environments. And in November, at the U.K. edition of its Cloud Next event, it unveiled a ton of capabilities around data encryption, network security, security analytics, and user protection.
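
As a rough illustration of what such a context-aware access decision involves, here is a short Python sketch; the rule values and the allow_request function are assumptions for demonstration, not Cloud IAP's actual policy language.

# Illustrative sketch of a context-aware access decision like the one
# Cloud Identity-Aware Proxy enforces; the rule values and this function
# are assumptions for demonstration, not Google's actual API.
from datetime import time

ALLOWED_REGIONS = {"US", "DE"}          # hypothetical policy values
WORK_HOURS = (time(7, 0), time(19, 0))

def allow_request(identity_verified: bool, region: str,
                  request_time: time, device_compliant: bool) -> bool:
    """Grant access only when identity AND request context check out."""
    in_hours = WORK_HOURS[0] <= request_time <= WORK_HOURS[1]
    return (identity_verified
            and region in ALLOWED_REGIONS
            and in_hours
            and device_compliant)

# A compliant device in an allowed region during work hours gets through
# without a VPN; anything else is denied at the proxy.
print(allow_request(True, "US", time(10, 30), True))   # True
print(allow_request(True, "BR", time(10, 30), True))   # False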


How The Cloud Can Solve Life Science’s Big Data Problem

A growing number of large and small companies seem to agree with Benchling’s approach. Benchling announced today that it added 150 new clients to its roster in 2019 — more than doubling its customer base for the second consecutive year — with a significant expansion of enterprise customers. Biopharma was its largest and fastest-growing customer sector, but it also saw strong initial deployments in a range of new industries, including biomaterials, energy, consumer goods, and food and beverage. Earlier this year, Benchling closed $34.5 million in Series C funding to extend its product lead and expand commercial relationships and opened its Cambridge, MA, office in April. Since launching in 2012, Benchling says it has grown to become the most widely adopted life science R&D cloud software, used by over 170,000 scientists worldwide. What’s most impressive about its customer base is its diversity: from start-ups to multinationals, in every sector imaginable, including many of synthetic biology’s best-known names — Zymergen, Synlogic, Regeneron, and Intellia, to name a few.


How machine learning can transform AML

Traditional AML solutions resort to hard segmentation of customers based on KYC data (often inaccurate or aged) or specific sequences of historical events. However, customers are too complex to be assigned to such hard-and-fast segments and need to be monitored continuously for anomalous behaviour through soft segmentation. The best approach is to aggregate customers’ banking transactions and generate archetypes of customer behaviour. Each customer is a mixture of these archetypes, and the mixture is adjusted in real time based on their financial and non-financial activity. Viewed in this “archetype space”, good customers behave similarly and form visible clusters. Machine learning models can then detect and score those customers who fall in areas of low density, or in areas of suspiciously uniform behaviour. Machine learning can also be used to bring the most urgent alerts to a human officer’s attention, which helps reduce false positives. For example, the FICO AML Threat Score prioritises investigation queues for SARs, to ensure the most pressing cases are looked at first.
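
A minimal sketch of this archetype approach, assuming NMF for the archetypes and a k-nearest-neighbour density proxy for the anomaly score; these technique choices are our illustration, not FICO's actual method.

# Sketch of archetype-based AML scoring as described above: NMF yields
# behavioural archetypes, each customer becomes a mixture of them, and
# low-density points in archetype space get higher anomaly scores.
# Technique choices (NMF, k-NN density) are illustrative assumptions.
import numpy as np
from sklearn.decomposition import NMF
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
# Rows = customers, columns = aggregated transaction features
# (e.g. monthly cash deposits, wire counts, cross-border volume).
X = rng.gamma(shape=2.0, scale=1.0, size=(500, 12))

# Each customer is a non-negative mixture of k behavioural archetypes.
nmf = NMF(n_components=5, init="nndsvda", max_iter=500, random_state=0)
mixtures = nmf.fit_transform(X)          # customer-by-archetype weights

# Density proxy: mean distance to the 10 nearest neighbours in
# archetype space; isolated customers score as more anomalous.
nn = NearestNeighbors(n_neighbors=11).fit(mixtures)
dist, _ = nn.kneighbors(mixtures)
anomaly_score = dist[:, 1:].mean(axis=1)  # drop self-distance (column 0)

# Route the highest-scoring customers to the investigation queue first.
queue = np.argsort(anomaly_score)[::-1][:20]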


Frankfurt shuts down IT network following Emotet infection
The infections across Germany aren't a surprise. In recent weeks, the Emotet gang has started targeting German users more often. On the same day the cities of Frankfurt and Bad Homburg were infected, the BSI sent out a security alert warning German organizations about an Emotet email spam campaign that was mimicking German government agencies -- most likely the method through which the two cities were infected. Joseph Roosen, a member of Cryptolaemus, a group of security researchers who track Emotet campaigns, told ZDNet that the Emotet operators often translate their email spam templates into German and target the country's users. ... At this point, it is very clear that the Emotet gang is putting considerable effort into infecting German targets, something it hadn't done on this scale before. While we've seen cities shut down networks in the past, this usually happened because of ransomware attacks. What German cities are doing now is a first: no cities have reacted like this to an Emotet infection before.


Master Data Management: A Modern Guide for Data Governance Professionals

Once you have this framework, you can aggregate your data, standardize it and match values. You can link and synchronize your records to align user and application data with the master set. Data quality tools support MDM frameworks by cleansing and normalizing data and removing errors and duplicate values. All of this has become extremely important, and difficult to achieve, in our data-driven age. Without MDM, large enterprises struggle to manage linkages and trends across the business. A data engineer at a large insurance firm recently described his team as “data-rich, but information-poor” because each of a dozen recently acquired business units had a slightly different definition of automobile status. Definitions ranged from “new” and “almost new” to “used” and “certified pre-owned.” So it’s no surprise that MDM is moving from “nice to have” to “must have” in many CIO budgets. Structured data volumes keep rising; platforms and applications keep sprawling; user groups and workflows keep drifting.
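
Using the article's own automobile-status example, here is a minimal Python sketch of the standardize-and-match step; the mapping table and helper function are illustrative assumptions, not any particular MDM product.

# Minimal sketch of the standardize-and-match step described above,
# using the automobile-status example: each business unit's local value
# is cleansed and mapped to one master definition. The mapping table
# and helper are illustrative assumptions.
MASTER_STATUS = {
    "new": "NEW",
    "almost new": "NEW",
    "used": "USED",
    "certified pre-owned": "CERTIFIED_PRE_OWNED",
}

def to_master_status(raw: str) -> str:
    """Cleanse a unit-level value and match it to the master set."""
    key = raw.strip().lower().replace("-", " ")
    try:
        return MASTER_STATUS[key]
    except KeyError:
        # Unmatched values are routed to data stewards, not guessed at.
        return "NEEDS_REVIEW"

records = ["New", " ALMOST NEW ", "certified pre-owned", "leased"]
print([to_master_status(r) for r in records])
# ['NEW', 'NEW', 'CERTIFIED_PRE_OWNED', 'NEEDS_REVIEW']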


Edge computing trends for 2020s send internet into a new era


Edge computing will also affect other use cases, such as manufacturing, retail, healthcare, automotive and residential environments. However, as the need for speed and compute closer to data sources grows, these use cases may shift, and organizations that provide infrastructure services may glean the most benefits from edge computing. "Worldwide, general availability of edge computing beyond these early use cases will become commonplace sometime between 2025 to 2028," Trifiro said. "Providing an edge computing environment for a factory floor might happen earlier." Use case deployment may depend on the existing devices and architecture of an organization's network. The internet was developed for humans, yet humans aren't the only things that now use the internet or require it to function. Communication among various machines, devices and applications requires faster architecture to function efficiently, and edge computing aims to answer this call.


Get 2020 vision about edge computing and 5G

Edge computing exists along a spectrum, from the device edge to the infrastructure edge.
Edge computing also saves you from shuttling every bit of data back and forth between connected devices and the cloud. If you can determine the value of information close to where it’s created, you can optimize the way it flows. Limiting traffic to just the data that belongs in the cloud cuts down on bandwidth and storage costs, even for applications that aren’t sensitive to latency. Reliability stands to benefit from edge computing, too. A lot can go wrong between the device edge and the centralized cloud, but in rugged environments like offshore platforms, refineries, or solar farms, the device and infrastructure edges can operate semi-autonomously when a connection to the cloud isn’t available. Distributed architectures can even be a boon to security. Moving less information to the cloud means there’s less information to intercept, and analyzing data at the edge distributes risk geographically. The endpoints themselves aren’t always easy to protect, so firewalling them at the edge helps limit the scope of an attack.
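
A minimal Python sketch of that edge-filtering pattern, assuming a simple 3-sigma test to decide which readings are worth sending upstream; the thresholds and structure are illustrative, not a production design.

# Sketch of the edge-filtering pattern described above: score data close
# to where it is created and forward only the readings that belong in
# the cloud, cutting bandwidth and storage. Thresholds are assumptions.
from statistics import mean, stdev

def should_forward(reading: float, window: list) -> bool:
    """Keep routine readings local; uplink only statistical outliers."""
    if len(window) < 10:
        return True                       # too little history: be safe
    mu, sigma = mean(window), stdev(window)
    return abs(reading - mu) > 3 * sigma  # ~3-sigma anomaly test

window, uplinked = [], []
for reading in [20.1, 20.3, 19.9, 20.0, 20.2, 20.1, 19.8,
                20.0, 20.2, 20.1, 47.5, 20.0]:
    if should_forward(reading, window):
        uplinked.append(reading)          # would be sent to the cloud
    window = (window + [reading])[-50:]   # rolling on-device history

print(uplinked)  # early warm-up readings plus the 47.5 spike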



Quote for the day:


"Every great leader has incredible odds to overcome." -- Wayde Goodall

