The security function is one of the most misunderstood parts of an organization, and I tip my hat to those who recognize the benefits of a finely tuned one. For those working in security, the following is nothing new, but rather an affirmation that competent, proactive security leaders can wear many hats in support of their objective: preventing threats to organizational assets and, moreover, responding to events with the goal of minimizing recovery time and the impact on the organization's brand and image. Beyond these multi-disciplinary responsibilities, the security leader is the person everyone in the organization will seek out for guidance and direction during a crisis.
In the survey, 79% of executives agree that AI will help accelerate technology adoption throughout their organizations. ... The Accenture authors cite a prime example of where AI is making its first inroads into enterprise UI and UX: voice-activated systems. "Advances in natural language processing and machine learning make technology more intuitive to use, like telling virtual assistants to schedule a meeting instead of accessing scheduling software to find a time, create an event, and type the details," they state. "AI already plays a variety of roles throughout the user experience. At the simplest level, it curates content for people, like the mobile app Spotify suggesting new music based on previous listening choices. In a more significant role, AI applies machine learning to guide actions toward the best outcome."
Today there are vibrant online marketplaces where just about anyone—even those with very limited technical knowhow—can buy tools to execute an attack. Cryptographic currencies enable untraceable digital payments, while old-fashioned economics is driving the growth of these marketplaces. Demand for services now outpaces supply, and DDoS-as-a-Service providers can bring in more than $100,000 annually. Purchasing an attack can be surprisingly inexpensive. On the Clearnet, for as little as $19.99 a month, an attacker can run 20-minute bursts for 30 days utilizing a number of attack vectors like DNS, SNMP, SYN and slow GET/POST application-layer DoS attacks. All an attacker has to do is create an account, select a plan, pay in Bitcoin and access the attack hub to target the victim by port, time and method.
In what threatens to become an unpleasant trend, uninsured disruptions and other business fallout from these attacks are increasingly cited as key factors in disappointing earnings reports. Cybersecurity is becoming a ratings boon for CNBC and other media outlets that report on stock markets and financials. More than a few security professionals are feeling the heat, however, as the industry is forced to take a closer look at the up-leveling of age-old deployment methods, from spear-phishing emails to internet worms that don't require human interaction to spread. What else could go wrong? Plenty. With HTTPS deployments on the rise, researchers from top universities and technology companies like Google have joined forces to document growing concerns about the risks associated with traditional man-in-the-middle defenses using Transport Layer Security (TLS) interception.
The computer-enhanced disinformation campaigns launched by Russia and others are fairly crude, and the effort to cover their tracks limited. The future of disinformation is likely to be much more sophisticated and harder to defend against. Disinformation is rapidly going multimedia, for example. Advances in A.I. and CGI will enable convincing audio and video that can make it appear that anyone is saying or doing anything. University of Washington researchers used A.I. to create a fake video showing former president Barack Obama saying things he never actually said. And Stanford researchers developed something they call Face2Face, which creates real-time faked video, so basically anybody can be shown to say anything in a live video chat. These techniques aren't perfect. But given time and better technology, they will be.
Even so, given just how dependent Microsoft increasingly is on Linux, it’s time for the company to not just innovate around the edges of the Linux ecosystem but to start contributing directly to the Linux kernel, commensurate with the value it derives therefrom. Ten years ago, Microsoft couldn’t do this without suspicion. Today, this is what we expect of Microsoft. Microsoft seems to understand this ... Without fanfare, Microsoft has started hiring Linux kernel developers like Matthew Wilcox, Pavel Shilovsky, and Stephen Hemminger. Hemminger’s hire is particularly interesting not only because he’s considered one of the big-time kernel developers, but also because it was he who, back in 2009, called out Microsoft for violating the GPL in its Hyper-V code.
Peeved about previous vulnerability disclosure experiences with D-Link, a security researcher has publicly disclosed 10 zero-day vulnerabilities in D-Link DIR 850L wireless AC1200 dual-band gigabit cloud routers. Security researcher Pierre Kim opted to publicly disclose the vulnerabilities this time, citing a “very badly coordinated” disclosure with D-Link in February; that time around he had reported nine vulnerabilities, but he said it took D-Link five months to release new firmware, which ended up patching only one of the flaws he found. Overall, Kim thinks D-Link 850L routers are “badly designed” as “basically, everything was pwned, from the LAN to the WAN. Even the custom MyDLink cloud protocol was abused.”
Traditionally, network provisioning and configuration management are manual, error-prone processes. Network virtualization enables the creation of networks in software, abstracted from the underlying physical hardware. IT can provision networks quickly, with network and security services attached to workloads using a policy-driven approach. Automation takes things to the next level: network functions, including managing bandwidth, load balancing, and performing root cause analysis, are provisioned automatically based on predefined policies. To eliminate its network bottleneck, the University of New Mexico deployed VMware’s NSX network virtualization platform and vRealize Automation cloud automation software.
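The policy-driven idea described above can be sketched in a few lines: instead of hand-configuring each network, a workload carries a tag, and the tag maps to a predefined bundle of network and security settings. This is a hypothetical minimal sketch in Python, not the NSX or vRealize API; all names (`POLICIES`, `provision`, the tag values) are illustrative assumptions.

```python
# Hypothetical policy table: a workload tag maps to predefined
# network and security settings (segment, firewall rules, LB).
POLICIES = {
    "web": {"segment": "dmz", "firewall": ["allow:80", "allow:443"], "load_balanced": True},
    "db":  {"segment": "internal", "firewall": ["allow:5432"], "load_balanced": False},
}

def provision(workload: str, tag: str) -> dict:
    """Return the network configuration a policy engine would attach
    to the workload automatically, based solely on its tag."""
    policy = POLICIES.get(tag)
    if policy is None:
        raise ValueError(f"no policy defined for tag {tag!r}")
    # The workload never specifies network details directly;
    # they are inherited from the policy.
    return {"workload": workload, **policy}

config = provision("orders-api", "web")
print(config["segment"])   # dmz
```

The point of the pattern is that adding a new workload requires no new network configuration: tagging it "web" or "db" is enough, and the policy engine attaches the right segment and firewall rules every time.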
The chaotic nature of IT transformation may also be a factor, as agencies attempt to simultaneously operate in two worlds: the old IT environment and the new. Tony Sager, senior vice president at the Center for Internet Security, said it's already challenging for federal IT leaders to meet the regulatory reporting requirements of the status quo without the "trauma" that comes from most large-scale IT transformations. "Where I see people stressed is if they see old requirements they're stuck dealing with while trying to change the IT infrastructure at the same time," said Sager. Of those who said their security issues have increased, more than half (53 percent) cited their IT staff's difficulties supporting and completing the transition from old technologies to new. Increased compliance reporting was the second-most frequently cited reason.
Super Micro designs and assembles server components, such as network cards, storage interfaces and CPUs. For an operating system to run on modern Super Micro systems, it interfaces with the BIOS and firmware, and in many cases the BIOS and firmware include significant functionality. These tools could come from an additional third party contracted by the manufacturer. The BIOS and firmware may need to be updated, and can still be compromised. Despite being custom servers, many of the servers' components are similar to those of mass-market servers, and Super Micro uses similar firmware and drivers to keep costs low. As more third parties get involved, the server attack surface grows larger, and responsibility for the hardware security of the finished product gets less clear.
Quote for the day:
"Life is too short to waste time waiting for other people's approval on how you live it." -- Steve Maraboli