"This is part of the learning curve," said Deepak Mohan, an analyst with IDC. "The negatives are attributed to the cloud and not to these mistakes that need to be corrected... If a company does not realize the cost savings and they fail to see the results they thought they'd get, the result is that there is a drop in faith and a lowering of confidence in your cloud strategy. And that will cause a slowdown in adoption." Part of the issue is that the cloud is really a different beast for a lot of IT shops. ... "What we learned is that while it's easy to get started, cloud is completely different from IT," said Temujin Baker, senior manager of engineering and architecture for King County, Wash. "How you run your business in the cloud is different than how you run it" on premises. "There are changes in how you do your work, the skills that are needed, the process."
Software archiving is nothing new: organizations such as Archive.org, Bitsavers.org, and the federal government's National Software Reference Laboratory, along with many smaller players, have been working for years to post applications online for public download, or at least for browser-based emulation. It never was easy, and now it's becoming more difficult. Preservationists are pooling resources because programs are going cloud-native and upgrades are increasingly invisible to users; how do you take a snapshot of a program that relies on constantly changing infrastructure? "The Software Preservation Network (SPN), we make no claims that we're the first people," noted SPN's Jessica Meyerson, a digital archivist at the University of Texas at Austin. "Many archivists, information professionals, and just individuals... have become the caretakers and maintainers of legacy software just because they see the value in doing so."
A cybersecurity best practice is to always avoid becoming the low-hanging fruit; by making hackers work just a little bit harder, your property could avoid a potential digital break-in. While Rodriguez likely had insider knowledge of Marriott’s internal systems and processes, it’s possible that following proper security protocol could have prevented, or deterred, his actions. At the recent Serviced Apartment Summit Americas event, hosted April 11 at the New York Marriott Downtown, Matthew Baker, senior associate at Katten Law, said data breaches in hotels are increasing in number and sophistication. Baker said one of the biggest threats to hotel security is vulnerabilities found in third-party contractors, and called for better and more thorough vetting before entering into digital partnerships.
“Attackers can simply move to different techniques – for example non-malware attacks that do not use binaries but scripts or macros – which are much harder to train/learn from an AI/ML perspective. Any preventative technology that relies on the classification of good or bad is always susceptible to the arms race,” he said. Reza Chapman, managing director of cybersecurity in Accenture’s health practice, said maintaining the effectiveness of AI/ML can require significant maintenance. “Detection thresholds need to be adjusted to reach a balance between false alarm rate and missed detection rate,” he said. “Further, constant tuning is often necessary within the specific operation environment. Overall, this is not a reason to steer away from these technologies. Instead, consider AI and ML as complementary to the personnel in your security program.”
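Chapman's point about balancing the false alarm rate against the missed detection rate can be made concrete with a tiny threshold sweep. This is a hypothetical sketch: the scores and labels below are fabricated illustration data, not the output of any real detector.

```python
# Sketch of detection-threshold tuning: raising the threshold lowers
# false alarms but raises missed detections, and vice versa.
# All data here is made up for illustration.

def rates(scores, labels, threshold):
    """Return (false_alarm_rate, missed_detection_rate) at a threshold."""
    fp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 0)
    fn = sum(1 for s, y in zip(scores, labels) if s < threshold and y == 1)
    return fp / labels.count(0), fn / labels.count(1)

# 1 = malicious, 0 = benign; scores stand in for classifier output.
labels = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]
scores = [0.1, 0.2, 0.3, 0.4, 0.7, 0.5, 0.6, 0.8, 0.9, 0.95]

for t in (0.35, 0.55, 0.75):
    fa, md = rates(scores, labels, t)
    print(f"threshold={t:.2f} false_alarm={fa:.2f} missed={md:.2f}")
```

Sweeping the threshold like this, per operating environment, is the "constant tuning" Chapman describes: no single cutoff minimizes both error rates at once.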
Perhaps the most important thing to realize is that technology alone will never solve the problem. Perfect email filters will cause the bad guys to use the phone. Perfect phone filters lead them to target people’s personal social media accounts. Close one door and they will find another—it’s not unlike those movies where the thief always gets the loot or the painting, no matter how many layers of security are employed. But there is something you can do about it. “Training and education has to be part of the solution to make people aware of these attacks, how they can detect, stop and report them,” wrote Sjouwerman. End-user Internet Security Awareness Training is all about teaching users not to do silly things like clicking on suspect URLs in emails, or opening attachments that let in the bad hats. Sjouwerman recommended putting all staff through such training.
"CLDAP reflection works in the same way as any other UDP-based reflection attack," Arteaga says. "[But] the amplification of the response is impressive compared to most other vectors," he says. On average, Akamai observed CLDAP-enabled DDoS attacks achieving amplifications of over 56%. The largest attack using CLDAP as the sole vector that Akamai has mitigated so far had a peak bandwidth of 24 Gigabits per second, or about two million packets per second. "These attacks are averaging around 3 gigabits per second—a pretty impressive number considering the limited number of available reflectors," ... CLDAP uses the User Datagram Protocol (UDP) instead of the Transmission Control Protocol (TCP) for communication. UDP does not validate source IP addresses, thereby making application-layer protocols that rely on it—such as CLDAP—good vectors for launching DDoS attacks.
A lot of the time, people will use ThingWorx in the factory, collecting information from sensors and controllers and various other pieces of hardware. ThingWorx is a great tool for aggregating that information. But it can also bring in data from other digital resources, such as CAD and PLM and even ERP. Really what all of this is about is allowing you to create a digital twin of what’s happening out there in the physical world. You’ve got some smart, connected product in the world, and you want to have a digital equivalent of it so you can understand how it’s being operated, predict when it’s going to fail, and make sure it’s operating most efficiently. The digital twin is getting more and more airplay. What goes into the digital twin? Ideally it’s everything you would ever want to know about that thing. In practical use cases, what do you care about? A digital twin could be a set of properties and their current attributes. It could be rich 3D information.
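The "set of properties and their current attributes" idea can be sketched as a small data structure. Everything here — the pump fields, the maintenance rule, the thresholds — is a hypothetical illustration, not ThingWorx code or API.

```python
# Minimal sketch of a digital twin: named properties mirroring a
# physical asset's current state, plus a toy predictive rule.
from dataclasses import dataclass, field
import time

@dataclass
class PumpTwin:
    asset_id: str
    temperature_c: float = 0.0
    vibration_mm_s: float = 0.0
    hours_run: float = 0.0
    last_updated: float = field(default_factory=time.time)

    def update(self, **readings):
        """Apply fresh sensor readings to the twin's properties."""
        for name, value in readings.items():
            setattr(self, name, value)
        self.last_updated = time.time()

    def needs_maintenance(self) -> bool:
        # Toy rule standing in for a real predictive-failure model.
        return self.vibration_mm_s > 7.1 or self.hours_run > 8000

twin = PumpTwin("pump-042")
twin.update(temperature_c=71.5, vibration_mm_s=7.8, hours_run=5200)
print(twin.needs_maintenance())  # True: vibration exceeds the toy limit
```

A richer twin would layer 3D geometry and historical telemetry on top, but even this property bag supports the "predict when it's going to fail" use case described above.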
For years, Apple has touted "thinner and lighter" as the hallmarks of each new release of the iPhone. With the Galaxy S8, Samsung has beaten Apple to the punch in a big way. The S8+ makes the iPhone 7 Plus and the Google Pixel XL feel like cinder blocks by comparison. With an almost bezel-less screen that covers 83% of the front of the phone, the S8+ has a 6.2-inch screen that offers extra functional real estate at the same time. Again, it makes the iPhone 7 Plus and the Pixel XL look outdated by comparison. ... Going into testing the Galaxy S8 Plus, my biggest concern was battery life, since the S8 Plus actually has a slightly smaller battery than last year's Galaxy S7 Edge (and that phone sometimes struggled to get through a full day). However, because of the S8's new 10nm processor and some nice power management features in the software, the S8 has ridiculously good battery life.
How can you calculate the value of a data breach in your organisation and implement an effective mitigation strategy? This is the question that Todd Forgie answers in his fascinating presentation, originally delivered at CLOUDSEC Singapore. You can either read the summary below, which includes Forgie's key recommendations, or watch the video at the end of the article. Forgie is the Vice President of IT and Managed Services at MEDHOST, a healthcare IT company in the US that services about 25% of hospitals in the US and Puerto Rico. He explains that due to the huge growth in the estimated number of successful cyber-attacks and ransomware, it's now critical for organisations to operate with the assumption of a breach. ... But in order to make this happen, his organisation had to accurately model the level of risk being faced and the value attributed to that risk.
Moderate- to high-risk actions should not be automated. Start by examining what could go wrong if an automated action is taken incorrectly. Generate as many scenarios as possible to try to discover circumstances in which the action could cause damage. In my experience, remediation steps like blocking IPs or disconnecting users should not be automated. Scale must also be considered when deciding which processes are appropriate for automation. For example, running a tool against a single system to gather some information can be automated when performed on a small scale, but if the same action is run on thousands of hosts, it can have a detrimental effect on network stability. For these types of actions, set thresholds for the type, amount and time frame of automated activities to protect the network.
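The closing suggestion — thresholds on the type, amount, and time frame of automated activity — can be sketched as a per-action rate limiter. The action names and limits below are hypothetical examples, not part of any real security product.

```python
# Sketch of threshold-gated automation: each action type is allowed
# only while it stays under a cap within a sliding time window.
import time
from collections import defaultdict, deque

class ActionThrottle:
    def __init__(self, limits):
        # limits: action type -> (max occurrences, window in seconds)
        self.limits = limits
        self.history = defaultdict(deque)

    def allow(self, action, now=None):
        """Record the action and return True, or False if over threshold."""
        now = time.monotonic() if now is None else now
        max_count, window = self.limits[action]
        q = self.history[action]
        while q and now - q[0] > window:
            q.popleft()              # drop events outside the window
        if len(q) >= max_count:
            return False             # threshold hit: escalate to a human
        q.append(now)
        return True

# Hypothetical policy: at most 100 info-gathering runs per hour.
throttle = ActionThrottle({"collect_info": (100, 3600)})
print(throttle.allow("collect_info"))  # True: well under the cap
```

When `allow` returns False, the playbook should hand off to an analyst rather than keep firing, which keeps a runaway automation from hitting thousands of hosts at once.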
Quote for the day:
"The greatest single human gift - the ability to chase down our dreams." -- Prof. Hobby