Daily Tech Digest - November 29, 2018

Closing the Awareness Gap in Technology Projects


The symptoms of a problem with operational awareness can vary. Sometimes you fail to obtain visibility at the level of accuracy you need; sometimes you get that visibility, but don’t know how to act on it; sometimes, even when insights lead to actions, these actions fail to lead to your desired results. If you’re trying, for example, to reduce time delays, your data analytics might show which parts of your project are moving more slowly than expected, but they’re unlikely to pinpoint the precise reason. Problems in one place might be the result of decisions made several steps back in the supply chain or project life cycle. Was planning off? Did procurement write a poor contract? Do your workers lack the necessary skills? The experience of using the system may also make it difficult for you and your employees to make sense of the data effectively. For example, in our work with boards of directors, who are taking a growing role in overseeing high-value projects, we sometimes observe members relying heavily on dashboards or documents developed with sophisticated data analytics.



Three steps toward stronger data protection

Applications responsible for originally sourcing data into the system or modifying data as part of business transactions should also be responsible for digitally signing data before persisting them into databases. Any application retrieving such data for business use must verify the digital signature before using the data or refuse to use data whose integrity has been compromised. These are concrete steps companies can begin to take immediately to protect themselves. Enabling FIDO in web applications is now possible within a few weeks; incorporating encryption with secure, independent key management systems into applications can be accomplished within a few months. Integrating digital signatures may be accomplished at the same time as encryption or pursued as a subsequent step. By enabling these security controls, companies place themselves well ahead of where the vast majority of attacks currently occur.
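
A minimal sketch of the sign-before-persist / verify-before-use pattern described above, assuming Python's cryptography package and Ed25519 keys; the record layout and in-memory key handling are illustrative only, since in practice the keys would come from an independent key-management system.

    # Sketch only: sign data before persisting, verify before use.
    # Key generation here is for illustration; keys would normally be
    # fetched from an independent key-management system or HSM.
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.exceptions import InvalidSignature

    signing_key = Ed25519PrivateKey.generate()
    verify_key = signing_key.public_key()

    def persist(record: bytes, db: dict, key: str) -> None:
        # The application that sources or modifies the data signs it before storage.
        db[key] = (record, signing_key.sign(record))

    def load(db: dict, key: str) -> bytes:
        # The retrieving application verifies the signature or refuses the data.
        record, signature = db[key]
        try:
            verify_key.verify(signature, record)
        except InvalidSignature:
            raise ValueError("integrity check failed; refusing to use record")
        return record

    db = {}
    persist(b'{"account": "123", "balance": 100}', db, "txn-1")
    print(load(db, "txn-1"))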


Google Faces GDPR Complaints Over Web, Location Tracking
Even though Location History is off by default, Google appears to encourage its users to turn it on through overly simplified and carefully designed user interfaces that may drive users to hit "approve." In contrast to the ease of enabling the feature, any user who wants to research what their choice might mean must undertake extra clicks or explore multiple submenus, Forbrukerrådet's report contends. These design choices may contradict GDPR's requirement for "specific and informed" consent, Forbrukerrådet says. "Users will often take the path of least resistance in order to access a service as soon as possible," the report says. "Making the least privacy friendly choice part of the natural flow of a service can be a particularly effective dark pattern when the user is in a rush or just wants to start using the service." Forbrukerrådet contends that if users don't click on Location History at the start, Google keeps trying to get them to enable it. For example, the report contends that in order to keep location-tracking disabled, users must again decline it when trying to use Google's Assistant, Maps and Photos apps.


Data Science “Paint by the Numbers” with the Hypothesis Development Canvas

The one area that is under-invested in most data science projects is the thorough and comprehensive development of the hypothesis or use case being tested; that is, what we are trying to prove out with our data science engagement and how we measure progress and success. To address these requirements, we developed the Hypothesis Development Canvas – a “paint by the numbers” template that we populate prior to executing a data science engagement to ensure that we thoroughly understand what we are trying to accomplish, the business value, how we are going to measure progress and success, and the impediments and potential risks associated with the hypothesis. The Hypothesis Development Canvas is designed to facilitate business stakeholder-data science collaboration.
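
To make the idea of the canvas concrete, here is a small Python sketch of the kind of structure it captures; the field names are assumptions drawn from the elements listed above (hypothesis, business value, measures of success, impediments and risks), not the authors' published template.

    # Illustrative sketch only: field names are assumptions based on the
    # canvas elements described above, not the actual template.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class HypothesisCanvas:
        hypothesis: str                   # what we are trying to prove out
        business_value: str               # why it matters to the business
        success_metrics: List[str] = field(default_factory=list)  # how progress/success is measured
        impediments: List[str] = field(default_factory=list)      # known blockers
        risks: List[str] = field(default_factory=list)            # potential risks if the hypothesis is wrong

    canvas = HypothesisCanvas(
        hypothesis="Predictive maintenance cuts unplanned downtime by 10%",
        business_value="Reduced outage costs and improved asset utilization",
        success_metrics=["unplanned downtime hours per quarter"],
        risks=["sensor data too sparse to model failures"],
    )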


6 Tips To Frame Your Digital Transformation With Enterprise Architecture


Call it digital transformation strategy—call it smart business—enterprise architecture is a method your company can use to organize your IT infrastructure to align with business goals. This isn’t a new concept. In fact, enterprise architecture has been around since the 1960s. But the overwhelming presence of tech in every facet of business today has forced us to rethink it, and to make it a more central focus of business management. ... Enterprise architecture deals with your organizational structure, business model, apps, and data just as much as it does information technology. When you put it together, you need to think from an employee perspective, a customer perspective, and from the perspective of meeting your business goals. After all, your digital transformation will impact your entire company, and your enterprise architecture will need to support it. Your enterprise architecture is of no use to anyone if no one but IT geeks can understand it. When you develop it, use common language. Create easy-to-understand examples.


Machine learning and the learning machine with Dr. Christopher Bishop

The field of AI is really evolving very rapidly, and we have to think about what the implications are, not just a few years ahead, but even further beyond. I think one thing that really characterizes the MSR Cambridge Research lab is that we have a very broad and multi-disciplinary approach. So, we have people who are real world experts in the algorithms of machine learning and engineers who can turn those algorithms into scalable technology. But we also have to think about what I call the sort of penumbra of research challenges that sit around the algorithms. Issues to do with fairness and transparency, issues to do with adversaries because, if it’s a publication, nobody is going to attack that. But if you put out a service to millions of people, then there will be bad actors in the world who will attack it in various ways. And so, we now have to think about AI and machine learning in this much broader context of large scale, real-world applications and that requires people from a whole range of disciplines.


Cloudlets extend cloud power to edge with virtualized delivery


With a cloudlet, there tend to be fewer users and they connect over a private wireless network. Cloudlets are also generally limited to soft-state data, such as application code or cached data that comes from a central cloud platform. In some ways, cloudlets are more like private clouds than public clouds, especially when it comes to self-management. With both cloudlets and private clouds, organizations deploy and maintain their own environments and determine the delivery of services and applications. Cloudlets also limit access to a local wireless network, whereas private clouds are available over the internet and other WANs to support as many users as necessary -- although nowhere near the number of users public clouds support. The private cloud theoretically serves users wherever they reside, whenever they need it and from any device capable of connecting to the applications. In contrast, cloudlets are specific to mobile and IoT devices in close proximity.
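
To illustrate the "soft-state only" point, here is a hypothetical Python sketch of a cloudlet-style cache that holds nothing it cannot re-fetch from the central cloud; the endpoint and TTL are invented for illustration and do not describe any particular cloudlet product.

    # Hypothetical sketch of a cloudlet holding only soft state: anything
    # cached here can be lost and re-fetched from the central cloud.
    # The endpoint and TTL are invented for illustration.
    import time
    import urllib.request

    CENTRAL_CLOUD = "https://central.example.com/assets/"   # hypothetical origin
    TTL_SECONDS = 300

    _cache = {}   # key -> (payload, fetched_at)

    def get(key: str) -> bytes:
        entry = _cache.get(key)
        if entry and time.time() - entry[1] < TTL_SECONDS:
            return entry[0]                    # served locally, low latency
        with urllib.request.urlopen(CENTRAL_CLOUD + key) as resp:
            payload = resp.read()              # authoritative copy lives in the cloud
        _cache[key] = (payload, time.time())
        return payload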


KingMiner malware hijacks the full power of Windows Server CPUs

The malware generally targets Microsoft IIS and SQL Server machines, using brute-force attacks to gain the credentials necessary to compromise a server. Once access is gained, a .sct Windows Scriptlet file is downloaded and executed on the victim's machine. This script detects the CPU architecture of the machine and downloads a payload tailored for the CPU in use. The payload appears to be a .zip but is actually an XML file which the researchers say will "bypass emulation attempts." It is worth noting that if older versions of the attack files are found on the victim machine, these files will be deleted by the new infection. Once extracted, the malware payload creates a set of new registry keys and executes an XMRig miner file, designed for mining Monero. The miner is configured to use 75 percent of CPU capacity but, potentially due to coding errors, will actually utilize 100 percent of the CPU. To make it more difficult to track or issue attribution to the threat actor, the KingMiner mining pool has been made private and the API has been turned off.


Managing a Real-Time Recovery in a Major Cloud Outage

While Always On Availability Groups is SQL Server’s most capable offering for both HA and DR, it requires licensing the more expensive Enterprise Edition. This option can deliver a recovery time of 5-10 seconds and a recovery point of seconds or less. It also offers readable secondaries for querying the databases (with appropriate licensing), and places no restrictions on the size of the database or the number of secondary instances. An Always On Availability Groups configuration that provides both HA and DR protections consists of a three-node arrangement with two nodes in a single Availability Set or Zone, and the third in a separate Azure Region. One notable limitation is that only the database is replicated and not the entire SQL instance, which must be protected by some other means. In addition to being cost-prohibitive for some database applications, this approach has another disadvantage: because it is application-specific, it requires IT departments to implement separate HA and DR provisions for all other applications.
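
As a rough operational illustration of the three-node arrangement described above, the following Python/pyodbc sketch queries SQL Server's Always On DMVs for replica role and synchronization health; the connection string and listener name are hypothetical, and this is a monitoring aid rather than part of configuring the availability group itself.

    # Hypothetical monitoring sketch: list replica roles and sync health
    # for an Always On Availability Group. Connection details are illustrative.
    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=ag-listener.example.com;DATABASE=master;Trusted_Connection=yes;"
    )

    query = """
    SELECT ar.replica_server_name,
           rs.role_desc,
           rs.connected_state_desc,
           rs.synchronization_health_desc
    FROM sys.dm_hadr_availability_replica_states AS rs
    JOIN sys.availability_replicas AS ar
      ON rs.replica_id = ar.replica_id;
    """

    for server, role, connected, health in conn.cursor().execute(query):
        print(f"{server}: {role}, {connected}, {health}")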


Reputational Risk and Third-Party Validation

Security ratings are increasingly popular as a means of selecting and monitoring vendors. But Ryan Davis at CA Veracode also uses BitSight's ratings as a means of benchmarking his own organization for internal and external uses. "Taking somebody's word for it isn't enough these days," says Davis, an Information Security Manager at CA Veracode. "You can't just say, 'Oh, yeah, well that person said they're secure.'" For CA Veracode, security ratings provided by BitSight offer validation to prospective customers. "We want [customers] to be able to have that comfort that somebody else is also asserting that we're secure." In an interview about the value of security ratings, Davis discusses:
How he employs BitSight Security Ratings;
The business value - internally and externally; and
How these ratings can be a competitive differentiator.

Davis is CA Veracode's Information Security Manager. He is responsible for ensuring the security and compliance of thousands of assets in a highly scalable SaaS environment. Davis has more than 15 years of experience in information technology and security in various industries.



Quote for the day:


"Without courage, it doesn't matter how good the leader's intentions are." -- Orrin Woodward

