Daily Tech Digest - November 09, 2018

Cisco Accidentally Released Dirty Cow Exploit Code in Software

“A failure in the final QA validation step of the automated software build system for the Cisco Expressway Series and Cisco TelePresence Video Communication Server (VCS) software inadvertently allowed a set of sample, dormant exploit code used internally by Cisco in validation scripts to be included in shipping software images,” the company said in an advisory. “This includes an exploit for the Dirty CoW vulnerability (CVE-2016-5195). The purpose of this QA validation step is to make sure the Cisco product contains the required fixes for this vulnerability.” Cisco said that it is not aware of “malicious use of the issue” and that the issue does not open the impacted software (Cisco Expressway Series and Cisco TelePresence Video Communication Server image versions X8.9 through X8.11.3) to any sort of attack. “The impacted software images will be removed and will be replaced by fixed images,” the company said. It did not specify when.

The Role of a Manager Has to Change in 5 Key Ways

“First, let’s fire all the managers,” said Gary Hamel almost seven years ago in Harvard Business Review. “Think of the countless hours that team leaders, department heads, and vice presidents devote to supervising the work of others.” Today, we believe that the problem in most organizations isn’t simply that management is inefficient; it’s that the role and purpose of a “manager” haven’t kept pace with what’s needed. For almost 100 years, management has been associated with the five basic functions outlined by management theorist Henri Fayol: planning, organizing, staffing, directing, and controlling. These have become the default dimensions of a manager. But they relate to pursuing a fixed target in a stable landscape. Take away the stability of the landscape, and one needs to start thinking about the fluidity of the target. This is what’s happening today, and managers must move away from the friendly confines of these five tasks.

Cloud, edge, and fog computing: understanding the practical application for each

Fog computing effectively “decentralises” computational and analytical power. It sits between your local equipment and mobile devices — equipment with limited processing power and storage, in other words — and provides a way to sift through streams of information from these and other components in your IoT. You can get a better mental image of fog computing by thinking about driverless automobiles navigating a city block. If the vehicles, their sensors, and their controllers are the “edge layer” for a city’s smart transportation system — we’ll get to edge computing in a moment — then there are likely micro-data centres alongside mesh routers and cell towers that serve as the “fog layer.” Fog computing isn’t quite as decentralised as the edge, but it does further reduce the amount of data transferred across the network or upwards into the cloud layer. It facilitates communication and collaboration between the “nodes” in the edge layer. In the example above, the nodes are the driverless cars.
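The layering described above can be sketched in a few lines of code. This is a purely illustrative toy (the `EdgeNode`/`FogNode` classes and their methods are invented for this example, not a real IoT API): edge nodes produce raw readings, and a fog node aggregates them locally so that only a small summary ever travels up to the cloud layer.

```python
# Hypothetical sketch: edge nodes stream readings to a fog node, which
# aggregates locally and forwards only a summary "up" to the cloud layer.
# All class and method names here are illustrative, not a real IoT API.

class EdgeNode:
    """A vehicle-mounted sensor producing raw readings (the 'edge layer')."""
    def __init__(self, node_id):
        self.node_id = node_id

    def read_speed(self):
        # Stand-in for a real sensor read.
        return 42.0

class FogNode:
    """A micro-data centre that aggregates edge traffic (the 'fog layer')."""
    def __init__(self):
        self.readings = []

    def ingest(self, node_id, speed):
        self.readings.append((node_id, speed))

    def summarize(self):
        # Only this small summary crosses the network to the cloud,
        # instead of every raw reading from every vehicle.
        speeds = [s for _, s in self.readings]
        return {"count": len(speeds), "avg_speed": sum(speeds) / len(speeds)}

fog = FogNode()
for car in (EdgeNode(f"car-{i}") for i in range(3)):
    fog.ingest(car.node_id, car.read_speed())

print(fog.summarize())
```

The point of the sketch is the data-volume reduction: three raw readings enter the fog node, but a single aggregate leaves it.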

Don’t make your cloud migration a house of cards

The biggest architectural mistake I see in the cloud involves coupling. Back in the day, applications were tightly coupled to other applications and data sources. If one thing stopped, the entire system stopped: if the database went down, so did all connected applications, including any systems that sent data to or received data from it. Years ago, we learned that tight coupling was bad. It killed resiliency, scalability, and the ability to use resources such as applications, databases, and queues independently. Consultants like me gave presentations on it, and books were published on the topic, but IT organizations are still making the same architectural mistakes in 2018, mistakes that diminish the value of cloud computing. IT is not fixing the things that need fixing before moving them to the cloud. At the core of the issue is money. Enterprises do not allocate enough funding to fix these issues before they move to the cloud. I assume the hope is that the issues won’t be noticed, or that a more modern platform will magically fix poor architectures.
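A minimal sketch of the decoupling the author is arguing for, using Python's standard-library `queue.Queue` as a stand-in for a managed cloud queue (the function names and order payloads are invented for illustration): the producer never talks to the database directly, so it keeps working even when the consumer is down.

```python
# Illustrative sketch of loose coupling via a message queue. queue.Queue here
# stands in for a managed cloud queue service; names are made up for the example.
from queue import Queue

order_queue = Queue()

def place_order(order):
    # The application enqueues and moves on; it never touches the database
    # directly, so a database outage cannot take it down with it.
    order_queue.put(order)
    return "accepted"

def database_writer():
    # Runs independently; if it is offline, orders simply wait in the queue
    # instead of failing.
    written = []
    while not order_queue.empty():
        written.append(order_queue.get())
    return written

# The app keeps accepting orders whether or not the writer is running.
place_order({"id": 1})
place_order({"id": 2})
print(database_writer())
```

Swap either side out independently (a new database, a rewritten front end) and the other side is untouched — which is exactly the independence tight coupling destroys.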

Seeing is believing, the old saw has it, but the truth is that believing is seeing: human beings seek out information that supports what they want to believe and ignore the rest. Hacking that human tendency gives malicious actors a lot of power. We see this already with disinformation (so-called "fake news"): deliberate falsehoods that spread under the guise of truth. By the time fact checkers start howling in protest, it's too late, and #PizzaGate is a thing. Deepfakes exploit this human tendency using generative adversarial networks (GANs), in which two machine learning (ML) models duke it out. One ML model trains on a data set and then creates video forgeries, while the other attempts to detect the forgeries. The forger creates fakes until the other ML model can't detect them. The larger the training data set, the easier it is for the forger to create a believable deepfake. This is why videos of former presidents and Hollywood celebrities have featured so frequently in this first generation of deepfakes — there's a ton of publicly available video footage to train the forger.
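The adversarial loop can be caricatured in one dimension. This toy (every number and update rule below is invented; real GANs train neural networks with gradient descent on images, not scalars) keeps the essential dynamic: a "detector" learns what real data looks like, and a "forger" keeps adjusting until its fakes pass.

```python
# Toy caricature of a GAN's adversarial loop in 1-D. All constants and the
# update rule are made up for illustration; real GANs use neural networks
# and gradient descent. The structure of the loop is what matters here.
import random

random.seed(0)
REAL_MEAN = 5.0  # the distribution of "real" training data

def detector(x, learned_mean):
    # Flags a sample as fake if it sits far from the real data it has seen.
    return abs(x - learned_mean) < 1.0  # True means "looks real"

# The detector "trains" by estimating the mean of real samples.
detector_mean = sum(random.gauss(REAL_MEAN, 0.5) for _ in range(1000)) / 1000

forger_mean = 0.0  # the forger starts out producing obvious fakes
for _ in range(200):
    fake = random.gauss(forger_mean, 0.5)
    if not detector(fake, detector_mean):
        # Caught: nudge the forger's distribution toward what passes as real.
        forger_mean += 0.1 * (detector_mean - forger_mean)

# By the end, the forger's output has drifted close to the real distribution
# and its fakes mostly fool the detector.
print(round(forger_mean, 2))
```

The excerpt's point about training data shows up here too: the detector's notion of "real" is only as good as the samples it saw, and the forger converges toward exactly that notion.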

The creation of one code base that is easy to maintain and publishes well across multiple OSes is no easy feat, said Jonathan Marston, director of software at Optimus Ride, a self-driving car company in Boston. Tools such as Adobe Air have tried and failed to achieve it, he said. "In the past, that dream has never lived up to the reality," Marston said. The ability to share code across multiple mobile OSes is getting more attainable with tools such as NativeScript and React Native, but the particular idiosyncrasies of each OS make it difficult to achieve complete code sharing, said Jesse Crossen, lead developer of VoiceThread, an education software company in Durham, N.C. For example, developers might want to write one set of code for an iOS visual component and another for an Android visual component, due to different screen sizes and resolutions. "You're always going to have that level of customization per platform or have [an app] that's a little bit generic," Crossen said.

While IoT is generally thought of in terms of consumer products, he pointed out that some IoT systems are widely used in the business context, such as building management systems that control the heating, cooling, door locks and fire alarms. “It is important that businesses think about the IoT devices they have in their environments. The gap between IT and services often creates opportunities for technology to cause problems, and so there are some key questions businesses need to ask suppliers, retailers, hardware manufacturers so you know whether you are buying a good product or one full of security vulnerabilities.” Munro said he was able to buy a controller for a building management system online and found vulnerabilities that could be exploited to discover the password of the embedded server, which would enable an attacker to take complete control of the building management system.

Microsoft: .NET Core Is the Future, So Get Moving

"As we move forward into the future, with .NET Core 3, we're going to see some more workloads that we're going to be working on here, mainly Windows desktop," Massi said. "We're bringing Windows desktop workloads to .NET Core 3, as well as AI and IoT scenarios. The big deal here is that now, if you're a WinForms or WPF developer, you can actually utilize the .NET Core runtime." It's still Windows, she said. It's still your Windows application framework for desktop apps, but developers will be able to take advantage of the .NET Core feature set, such as improved performance, side-by-side installs, language features and other innovations being made in the platform itself. "So that's kind of a big deal," Massi said. While .NET Core is about improved performance, self-contained .exe files for desktop application deployment flexibility and more, it also provides UI interop. "It's about, instead of totally rewriting your apps to take advantage of Windows 10 or more modern UI controls, we're making it so that you can use modern UI controls in WinForms and WPF -- that's what UI interop is," Massi said.

10 signs you may not be cut out for a systems analyst position

The ability to say "No" is important in managing all areas of life, but as a systems analyst, someday your job may depend on it. Suppose you're in a meeting with your boss, their boss, and management from the operations side. Someone tries to get you to commit, on the spot, to adding new functionality, and your boss is not interceding for you. Under pressure, many people would say "Yes" just to get out of the meeting. But unless you know for certain that you can deliver the project within the required time and budget, resist the temptation to say "Yes" just to get them off your back. Agreeing to a task that turns out to be unreasonable is just a setup for failure. ... Saying "No" may prevent you from promising the impossible, but it's best to use the word sparingly. To succeed as a systems analyst, you'll need to think of yourself as an in-house consultant. The business needs IT tools to make money, and you have to figure out how to provide those tools. Work with your in-house customers to develop a plan you can say "Yes" to. Figure out what you need—more time, more money, more human or technical resources—and be prepared to back up your requests.

The security skills shortage: A golden opportunity for creative CISOs

The very shallow security skills talent pool has also led to another opportunity, one that serves to up-skill and empower in-house (and even outsourced) development teams. It is a known fact that most of the world’s highest-scale security breaches were made possible by errors in the software code itself, and with the average breach costing in excess of US$3.6 million, it makes sense to examine the application security budget. It stands to reason that if developers remain untrained, the same mistakes will be made year after year, and the same reactive, expensive after-the-fact fixes will need to be applied. It seems a crazy way to burn through cash, all while an organization’s reputation as a security-conscious company goes down the drain. So, why not change it up and secure software from the start of production? Empowering development teams to write secure code is the golden opportunity for CISOs to seize proactive control over looming security issues, and one that offers the chance for fast, easy and measurable improvements — for both security and development teams.
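A concrete example of the kind of code-level error such training targets — SQL injection, one of the most common breach vectors — can be shown in a few lines (the table, data, and attacker string below are invented for illustration; Python's standard `sqlite3` module stands in for any database driver):

```python
# Hedged illustration of SQL injection vs. a parameterized query.
# Table, rows, and the attacker-controlled string are made up for the example.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 0), ('root', 1)")

malicious = "alice' OR '1'='1"

# Vulnerable: string concatenation lets the attacker's quote break out of the
# literal, so the WHERE clause matches every row in the table.
unsafe = conn.execute(
    f"SELECT name FROM users WHERE name = '{malicious}'"
).fetchall()

# Safe: the driver binds the value as data, so the lookup fails as it should.
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (malicious,)
).fetchall()

print(len(unsafe), len(safe))  # prints "2 0"
```

The fix costs one line at write time; the same flaw found after a breach costs incident response, remediation, and reputation — which is the budget argument the author is making.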

Quote for the day:

"Perhaps the ultimate test of a leader is not what you are able to do in the here and now - but instead what continues to grow long after you're gone" -- Tom Rath
