Daily Tech Digest - July 09, 2019

Colocation facilities buck the cloud-data-center trend

Poole said the average capital expenditure for a stand-alone enterprise data center that is not part of the corporate campus is $9 million. Companies are increasingly realizing that it makes sense to buy the racks of hardware but place them in someone else’s secure facility that handles the power and cooling. “It’s the same argument for doing cloud computing but at the physical-infrastructure level,” he said. Mike Satter, vice president for OceanTech, a data-center-decommissioning service provider, said enterprises should absolutely outsource data-center construction or go the colo route. Just as there are contractors who specialize in building houses, there are experts who specialize in data-center design, he said. He added that many data-center closures are followed by consolidation elsewhere. “For every decommissioning we do, that same company is adding to another environment somewhere else. With the new hardware out there now, the servers can do the same work in 20 racks as they did in 80 racks five years ago. That means a reduced footprint and energy cost,” he said.


The Phantom Menace in Unit Testing

This is not a rant about unit testing; unit tests are critically important elements of a robust and healthy software implementation. Instead, it is a cautionary tale about a small class of unit tests that may deceive you by seeming to provide test coverage but failing to do so. I call this class of unit tests phantom tests because they return what are, in fact, correct results, but not necessarily because the system-under-test (SUT) is doing the right thing or, indeed, doing anything. In these cases, the SUT “naturally” returns the expected value, so whether it does (a) the correct thing, (b) something unrelated, or even (c) nothing, the test still passes. If the SUT is doing (b) or (c), then the test is adding no value. Moreover, I submit that the presence of such tests is often deleterious, leaving you worse off than having no test at all, because you think you have coverage when you do not. When you then make a change to the SUT supposedly covered by that test and the test still passes, you might blissfully conclude that your change did not introduce any bugs, and go on your merry way to your next task.
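To make the failure mode concrete, here is a minimal sketch in Python (pytest style); the function, fixture data, and test names are hypothetical, not taken from the article:

```python
# Hypothetical SUT: supposed to drop inactive users from a list, but
# this buggy version forgets to filter and returns the input unchanged.
def remove_inactive(users):
    return users

def test_remove_inactive_phantom():
    # Phantom test: every user in the fixture is already active, so the
    # expected output equals the input. This passes whether the SUT
    # filters correctly, does something unrelated, or does nothing.
    users = [{"name": "ada", "active": True},
             {"name": "grace", "active": True}]
    assert remove_inactive(users) == users

def test_remove_inactive_exercises_sut():
    # Non-phantom version: the fixture contains an inactive user, so
    # only a correct implementation can produce the expected result.
    # This test fails against the buggy SUT above, which is the point.
    users = [{"name": "ada", "active": True},
             {"name": "grace", "active": False}]
    assert remove_inactive(users) == [{"name": "ada", "active": True}]
```

Against the buggy implementation, the phantom test passes and the second test fails immediately; the entire difference lies in choosing fixture data that only a correct SUT can transform into the expected result.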


British Airways facing £183m GDPR fine


The ICO said its investigation found that a variety of information was compromised by poor security arrangements at the company, including login, payment card, and travel booking details, as well as name and address information. Information Commissioner Elizabeth Denham said: “People’s personal data is just that – personal. When an organisation fails to protect it from loss, damage or theft it is more than an inconvenience. That’s why the law is clear – when you are entrusted with personal data you must look after it. Those that don’t will face scrutiny from my office to check they have taken appropriate steps to protect fundamental privacy rights.” The ICO said BA has cooperated with the investigation and has made improvements to its security arrangements since the breach came to light. The company now has 28 days to make representations to the ICO about the findings of its investigation and the proposed fine. Willie Walsh, chief executive of BA owner International Airlines Group, has confirmed that the airline will make representations to the ICO, according to Reuters. “We intend to take all appropriate steps to defend the airline’s position vigorously, including making any necessary appeals,” he said.



How artificial intelligence can transform the legal sector


Firms are experimenting with the use of chatbot technology to deliver basic legal advice. DoNotPay has already garnered a lot of attention for allowing users to appeal parking fines, and it’s not unreasonable to expect that, as the technology becomes more sophisticated, higher-quality and more specific advice could be offered by a similar machine learning (ML) tool. As with all new technologies, the ultimate aim is to ensure that firms offer a higher-quality and more consistent service. It is important to note that AI is not positioned to outperform legal professionals at high-end tasks, but should rather be seen as a tool designed to support them by carrying out time-consuming research or administrative tasks. That’s why now is the time for forward-thinking firms to begin integrating AI into their legal services as well as their administrative procedures. The use of AI does, however, give rise to a few practical and ethical considerations that legal teams must be aware of. Many revolve around the sensitive data that firms would be required to store on clients in order to offer an optimal service.


Think like a criminal to beat them at their own game ⁠— Frank Abagnale Jr

Crime today, of course, has a significant physical element. However, over the last 20 years there has been a criminal movement towards the digital. Cybersecurity Ventures predicts cybercrime damages will cost the world $6 trillion annually by 2021, up from $3 trillion in 2015. The attack surface is now different, but “the one thing that never changes is that criminals are all the same,” said Mr Abagnale. “So, if you think like a criminal, it doesn’t matter what they do, you can figure out their motives and means.” Over the course of a 43-year career with the FBI, Mr Abagnale has worked on every single data breach, including TJX in 2007 and, more recently, the Marriott and Facebook breaches. “The one thing that I’ve learnt is that every breach occurs because somebody in that company did something that they weren’t supposed to do, or somebody in that company failed to do something they were supposed to do,” he said. “It always comes down to the human element,” reiterated Mr Abagnale.


IAM market evolves, but at a cost

"We're in this spot with a lot of technical debt," Daum said, adding that State Auto is a G Suite customer and is in the cloud with AWS, but is hesitant to add on another vendor just for identity management. "We're paying a lot of money to a lot of different companies and we're trying to find a way to see which of those companies can be used for identity services. No offense to Ping Identity or Okta, but why pay them however much money if we can limit the amount of cooks in the kitchen." Emerging capabilities within IAM products intrigued Daum, but never bested ROI. "Where's the value added?" Daum said. "Everyone is talking about cloud and password-less and zero trust. Those buzzwords sound nice, but the cost to implement is still huge." Zero trust is a security architecture introduced by Forrester Research that is designed to assess threats not just from outside the network, but from within it. It uses the principle "never trust, always verify" anything trying to connect to the network to ensure it remains secure.


Don’t wait up for the open cloud

Open clouds have been a concept since cloud computing became a thing; the reality is that we’re dealing with public companies that have to return an investment to shareholders. They operate based on gaining profitable advantages and working within their own market microcosms. They court users in their own way, pushing their own cloud services, which leads to workloads that are not easily transported from cloud to cloud. Indeed, if the objective is “cloud native,” by definition that’s going to mean lock-in. A few open cloud standards have been pushed in the past and are still being pushed today. Although they found traction as private clouds, and even powered some public cloud instances, private clouds have declined relative to public clouds, and those public cloud instances have shut down. It’s just too hard to keep up with the larger public cloud players and their billion-dollar R&D and marketing budgets. This leads me to a few conclusions about the state of cloud computing now, as well as some projections of where things are likely to go: the notion of interoperable public clouds is not likely to happen unless the user bases demand it and the public cloud providers feel the pinch.


Dos and don'ts of navigating data analytics in the cloud

The marketing hype positioning the cloud as an “easy button” can draw you in, but the reality is that moving an enterprise data warehouse or another type of analytical environment to the cloud is just like moving from one database platform to another – and it comes with the same challenges. You and your team need to be ready to migrate, monitor and test the new environment, and when you are migrating systems that have evolved over time, “lifting and shifting” does not come without running into technology issues or making functional decisions that affect how a business or application is run. It’s true that with the cloud you never have to perform low-level administration of your environment, such as software updates and server sizing. However, higher-level administration such as database performance, usage analysis, cost management, and security and privacy management will always be a requirement.


Must-have features in a modern network security architecture

As the old security adage goes, “the network doesn’t lie.” Since all cyber attacks use network communications as part of their kill chain, security analysts must have access to end-to-end network traffic analysis (NTA) up and down all layers of the OSI stack. The best NTA tools will supplement basic traffic monitoring with detection rules, heuristics, scripting languages, and machine learning that can help analysts detect unknown threats and map malicious activities into the MITRE ATT&CK framework. ... Network security technologies must support granular policies and rules, subject to immediate alteration based upon changes in things such as user location, network configuration, or newly discovered threats/vulnerabilities. Organizations must have the ability to spin up/spin down or change network security services whenever and wherever they are needed. Modern network security controls must be able to accommodate internet of things (IoT) devices and protocols with the same types of strong policies and enforcement as they offer for standard operating systems. Finally, network security architectures must be built around easily accessed APIs for rapid integration.
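As a rough illustration of what granular, quickly alterable rules might look like in practice, here is a hedged Python sketch; the rule names, flow attributes, and match logic are hypothetical, not from any particular NTA or policy product:

```python
# Policies keyed on attributes such as user location or device type,
# swappable at runtime without touching the evaluator.
policies = [
    # (name, predicate, action)
    ("block-iot-nonstandard-egress",
     lambda f: f["device_type"] == "iot" and f["direction"] == "egress"
               and f["dst_port"] not in (443,),
     "deny"),
    ("geo-restrict-admin",
     lambda f: f["service"] == "admin" and f["user_location"] != "HQ",
     "deny"),
]

def evaluate(flow: dict) -> str:
    # First matching rule wins; a default-deny stance would be stricter.
    for name, predicate, action in policies:
        if predicate(flow):
            return action
    return "allow"

# Rules can be appended or removed on the fly, e.g. in response to a
# newly discovered vulnerability, which is the "immediate alteration"
# property described above.
print(evaluate({"device_type": "iot", "direction": "egress",
                "dst_port": 23, "service": "telemetry",
                "user_location": "HQ"}))  # deny
```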


A Simplified Value Stream Map for Uncovering Waste

There are a number of ways to display waste in a system. The most common approach is probably the use of value stream maps. These are maps that show the journey of a product from raw material to finished goods delivered to customers. They are very helpful in understanding the flow of goods and pinpointing wasteful delays. These don’t always seem relevant to software engineering because the images of factories, trucks, and forklifts don’t apply. Even the versions developed specifically for software sometimes seem to lack the qualities of being simple and definitive. What if we just want to know one thing: for any given process, how much time is spent waiting versus working? This would give us a simplified view of waste for any process and would be helpful in making it more efficient. The details for constructing this are straightforward. Let’s define working as time spent actively creating a product, time for which customers would gladly pay. Let’s define waiting as time spent waiting on something, time for which customers would not want to pay. We use duration (not effort) for both and we maintain consistent time units between them.
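A small worked example in Python shows how the arithmetic falls out; the process steps and durations are hypothetical:

```python
# Simplified value stream: for each step, record duration spent
# working vs. waiting, in a consistent unit (hours here).
steps = [
    # (step, working_hours, waiting_hours)
    ("write code",      16,  0),
    ("await review",     0, 24),
    ("code review",      2,  0),
    ("await deploy",     0,  8),
    ("deploy + verify",  1,  0),
]

working = sum(w for _, w, _ in steps)   # 19 hours
waiting = sum(q for _, _, q in steps)   # 32 hours
total = working + waiting               # 51 hours

# Flow efficiency: the fraction of elapsed time a customer would
# actually pay for.
print(f"working: {working} h, waiting: {waiting} h")
print(f"flow efficiency: {working / total:.0%}")  # 37%
```

In this made-up pipeline, roughly two-thirds of the elapsed time is queueing, which is exactly the kind of waste the simplified map is meant to expose.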



Quote for the day:


"The one nearest to the enemy is the real leader." -- Ugandan Proverb

