Can Continuous Intelligence and AI Predict the Spread of Contagious Diseases?
Past efforts to model the spread of contagious diseases may have made false assumptions about the data they relied on. Does the fact that many people in one geographic region search for the name of an emerging contagious disease mean the disease is present and growing? Perhaps, perhaps not. The danger is relying on coincidence rather than linking cause to effect. Did past and current efforts have all the data they needed? One issue with forecasting the spread of a disease is that models might not have accurate data, and the issue is especially acute at the onset of a new disease. Flu-like symptoms are easy to confuse with those of other illnesses, and doctors may not know the symptoms of a disease at its onset, or they may make inaccurate diagnoses. Are the models based on the right science? At the early stage of investigating a newly found disease, even basic information, like how the disease spreads, is unknown. Is it airborne? Does it spread via exposure to blood or other bodily fluids? What is the incubation period? Such mechanisms need to be nailed down before predictions can be made.
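How much those unknowns matter can be made concrete with a toy compartmental model. The sketch below is a minimal SIR simulation, not any published forecasting model; the transmission and recovery rates are invented, purely to show how two plausible assumptions about the same new disease yield very different forecasts.

```python
# Minimal SIR model over population fractions; all parameters are invented.
def sir_peak(beta, gamma, s0=0.99, i0=0.01, days=180, dt=1.0):
    """Return the peak infected fraction under an assumed transmission
    rate beta and recovery rate gamma (both per day)."""
    s, i, r = s0, i0, 0.0
    peak = i
    for _ in range(int(days / dt)):
        new_infections = beta * s * i * dt
        new_recoveries = gamma * i * dt
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        peak = max(peak, i)
    return peak

# Two plausible guesses about the same disease, very different forecasts.
print(f"slow spread: peak {sir_peak(beta=0.25, gamma=0.10):.0%} infected")
print(f"fast spread: peak {sir_peak(beta=0.50, gamma=0.10):.0%} infected")
```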
Out at Sea, With No Way to Navigate: Admiral James Stavridis Talks Cybersecurity
We're still figuring out how this is going to work. To shift metaphors to the oceans, it's as though we're out at sea, we're in a bunch of boats, but we haven't really put in place buoys and navigational aids, and we haven't really defined who's going to protect us. So if I'm a commercial ship at sea, I know the US Navy is going to come and defend me if I'm an American ship and I'm under attack. And in fact, we actively discourage merchant ships from mounting their own defenses. The defense requirements, I think, ought to be vested in the state. But in the world of cyber, realistically, if you're a commercial entity, particularly in a target-rich kind of environment like financials or critical infrastructure, say the electric grid, the government so far has not really stepped up to that task of broadly protecting you. Yeah, you can get some help from the NSA and some help from the FBI and some help from the CIA. But broadly speaking, you are going to have to have some mechanisms of your own, at least on the detection side and on the defensive side.
Containers march into the mainstream
A decade ago, Solomon Hykes’ invention of Docker containers had an analogous effect: with a dab of packaging, any Linux app could plug into any Docker container on any Linux OS, no fussy installation required. Better yet, multiple containerized apps could plug into a single instance of the OS, with each app safely isolated from the others, talking to the OS only through the Docker API. That shared model yielded a much lighter-weight stack than the virtual machine (VM), the conventional vehicle for deploying and scaling applications in cloudlike fashion across physical computers. So lightweight and portable, in fact, that developers could work on multiple containerized apps on a laptop and upload them to the platform of their choice for testing and deployment. Plus, containerized apps start in the blink of an eye, whereas VMs typically take the better part of a minute to boot. To grasp the real impact of containers, though, you need to understand the microservices model of application architecture: many applications benefit from being broken down into small, single-purpose services that communicate with each other through APIs, so that each microservice can be updated or scaled independently.
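To make the microservices idea concrete, here is a minimal sketch of a single-purpose service using only the Python standard library. The service name, endpoint, and port are illustrative, not drawn from the article; in practice each such service would be packaged into its own container image and scaled independently of its peers.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
import json

class InventoryHandler(BaseHTTPRequestHandler):
    """One small, single-purpose service: it answers only /health."""

    def do_GET(self):
        if self.path == "/health":
            body = json.dumps({"service": "inventory", "status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    # In a containerized deployment, this process would be the sole app in
    # its image, updated and deployed independently of other services.
    HTTPServer(("0.0.0.0", 8080), InventoryHandler).serve_forever()
```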
Democratizing data, thinking backwards and setting North Star goals
Essentially, databases are a fairly old technology, but they have always been about three things. The first is value: how do you get the best out of your data? That is, what are the features you provide, the power of querying the data, of updating it, of correlating it, and doing things with the data? The second has been security: how do you make sure that the data stays under your control, that you own it and determine what happens with it? And the third, call it cost or performance, is making sure that you don’t overpay for the data, that it keeps getting more affordable to do what you want to do with your data and to control it. ... The best way to process data is if it’s really structured and you know exactly what it is, right? You have a schema, essentially. And I spent a lot of time working on semi-structured data, which has some structure that you can extract. That is about getting good value out of all your data, not just structured data like your bank accounts, but also your email, the books you write, the Word documents you write, getting some value out of that.
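A small illustration of that last point: semi-structured records share some fields but no single rigid schema, and you can still query across them. The JSON records and field names below are hypothetical.

```python
import json

# Hypothetical semi-structured records: all JSON, no single rigid schema.
records = [
    '{"type": "email", "from": "a@example.com", "subject": "Q3 report"}',
    '{"type": "doc", "title": "Draft chapter", "words": 4200}',
    '{"type": "email", "from": "b@example.com"}',  # no "subject" field
]

# Query across the loose structure: count emails per sender, tolerating
# missing fields instead of requiring a schema up front.
senders = {}
for raw in records:
    rec = json.loads(raw)
    if rec.get("type") == "email":
        sender = rec.get("from", "unknown")
        senders[sender] = senders.get(sender, 0) + 1

print(senders)  # {'a@example.com': 1, 'b@example.com': 1}
```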
Artificial intelligence and machine learning an essential part of cybersecurity
World Wide Technology also plans to use AI and ML this year as part of its cybersecurity plans, according to chief technology advisor Rick Pina. "In today's digital age, the security of data, applications, and processes is of the utmost importance; and AI and ML now play an integral part in this cybersecurity process. AI and ML have brought enticing new prospects for speed, accuracy, and connectivity to the public and private sectors, allowing government agencies and corporate organizations to make great strides in governed self-service access, alongside data security and reliability," Pina said. ... Michael Hanken, vice president of IT at Multiquip, said he isn't planning to use AI and ML yet, but he is researching their benefits and limits to see how they might work in conjunction with cybersecurity in the future. Dan Gallivan, director of IT for Payette, said, "AI and ML are not part of the official plan this year, but I do feel they are in the not-too-distant future as we learn more about artificial intelligence and machine learning development capabilities and then experiment with them in cybersecurity."
7 Cloud Attack Techniques You Should Worry About
As organizations transition to cloud environments, so too do the cybercriminals targeting them. Learning the latest attack techniques can help businesses better prepare for future threats. "Any time you see technological change, I think you certainly see attackers flood to either attack that technological change or ride the wave of change," said Anthony Bettini, CTO of WhiteHat Security, in a panel at last week's RSA Conference. It can be overwhelming for security teams when organizations rush headfirst into the cloud without consulting them, putting data and processes at risk. Attackers are always looking for new ways to leverage the cloud. Consider the recently discovered "Cloud Snooper" attack, which uses a rootkit to slip malicious traffic past a victim's Amazon Web Services and on-prem firewalls before dropping a remote access Trojan onto cloud-based servers. But even as novel attacks like these continue to pop up, many criminals rely on tried-and-true methods, like brute-forcing credentials or accessing data stored in a misconfigured S3 bucket. There's a lot to keep up with, security pros say.
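On the defensive side, the misconfigured-bucket problem is straightforward to audit for. The sketch below uses boto3 to flag buckets whose ACLs grant access to everyone; it assumes AWS credentials are already configured and is a starting point rather than a complete audit (bucket policies and public-access-block settings also matter).

```python
import boto3

# ACL grantee URIs that mean "everyone" or "any AWS account".
PUBLIC_GRANTEES = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}

s3 = boto3.client("s3")
for bucket in s3.list_buckets()["Buckets"]:
    acl = s3.get_bucket_acl(Bucket=bucket["Name"])
    for grant in acl["Grants"]:
        if grant["Grantee"].get("URI") in PUBLIC_GRANTEES:
            print(f"{bucket['Name']}: public {grant['Permission']} grant")
```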
Robotic Process Automation Implementation Choices
The first step in implementing RPA is identifying tasks that lend themselves to automation. Even though RPA application areas cut across broad swaths of organizations, there are common characteristics to look for. Specifically, IBM notes that an “RPA-ready” task is one that is simple, consistent, and repeatable; is repetitive, low-skill work that creates human issues such as high error rates and low worker morale; belongs to an existing or planned process where stripping off routine steps can free humans and deliver significant productivity, efficiency, or cost benefits; or offers a meaningful opportunity to improve customer and worker experiences by speeding up an existing process. Some tasks may meet many of these criteria but still not be suitable for RPA: a task may meet every criterion, but if it requires additional data-capture capabilities or a redesign of the process, RPA may not be the right fit. RPA can be applied to a very broad range of tasks across most industries.
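One way to operationalize that checklist is a simple screening function. The criteria names and the veto rule below are hypothetical, loosely following the IBM characteristics above.

```python
# Hypothetical screening helper applying the IBM-style criteria above.
CRITERIA = (
    "simple_consistent_repeatable",
    "repetitive_low_skill",
    "frees_humans_for_higher_value_work",
    "speeds_up_existing_process",
)

def rpa_ready(task):
    # A task can meet every criterion and still be a poor fit if it first
    # needs new data-capture capabilities or a process redesign (the
    # caveat in the paragraph above).
    if task.get("needs_new_data_capture") or task.get("needs_redesign"):
        return False
    return all(task.get(c, False) for c in CRITERIA)

invoice_entry = {c: True for c in CRITERIA}
print(rpa_ready(invoice_entry))                              # True
print(rpa_ready({**invoice_entry, "needs_redesign": True}))  # False
```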
Android security warning: One billion devices no longer getting updates
All of the phones in the tests were successfully infected by the Joker (also known as Bread) malware. Every device tested was also vulnerable to BlueFrag, a critical flaw in the Bluetooth component of Android. Which? said there should be greater transparency around how long updates for smart devices will be provided, so that consumers can make informed buying decisions, and that customers should get better information about their options once security updates are no longer available. The watchdog also said that smartphone makers have questions to answer about the environmental impact of phones that can only be supported for three years or less. Google told ZDNet: "We're dedicated to improving security for Android devices every day. We provide security updates with bug fixes and other protections every month, and continually work with hardware and carrier partners to ensure that Android users have a fast, safe experience with their devices." When operating system and security updates are delivered varies depending on the device, manufacturer and mobile operator. Because smartphone makers tweak bits of the Android operating system, they often deploy patches and updates at a slower pace than Google does on its own devices, or not at all.
The Dark Side of Microservices
From a technical perspective, microservices are strictly more difficult than monoliths. From a human perspective, however, microservices can improve the efficiency of a large organization. They allow different teams within a large company to deploy software independently, which means teams can move quickly without waiting for the lowest common denominator to get their code QA’d and ready for release. It also means there is less coordination overhead between engineers, teams, and divisions within a large software engineering organization. While microservices can make sense, the key point is that they aren’t magic. Like nearly everything in computer science, there are tradeoffs, in this case trading technical complexity for organizational efficiency. That is a reasonable choice, but you had better be sure you need that organizational efficiency for the technical challenges to be worth it. Yes, of course, most clocks on earth aren’t moving anywhere near the speed of light. Furthermore, several modern distributed systems rely on this fact, using extremely accurate atomic clocks to sidestep the consensus issue.
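The clock remark is easier to see with a toy example: with skewed clocks, ordering events by local timestamps across nodes can disagree with the order in which the events actually happened, which is why systems without very accurate clocks need consensus protocols to agree on order. The skew values below are made up.

```python
# Toy illustration: node_b's clock runs 40 ms fast (skew values invented).
SKEW = {"node_a": 0.000, "node_b": 0.040}

def stamp(node, true_time):
    # Local timestamp = true time plus that node's clock skew.
    return true_time + SKEW[node]

# write_x really happens 20 ms BEFORE write_y...
events = [("write_x", "node_b", 1.000), ("write_y", "node_a", 1.020)]
stamped = [(name, stamp(node, t)) for name, node, t in events]

# ...but sorting by local timestamps puts write_y first: the wrong order.
print(sorted(stamped, key=lambda e: e[1]))
# [('write_y', 1.02), ('write_x', 1.04)]
```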
Essential things to know about container networking
Choosing the right approach to container networking depends largely on application needs, deployment type, use of orchestrators and underlying OS type. "Most popular container technology today is based on Docker and Kubernetes, which have pluggable networking subsystems using drivers," explains John Morello, vice president of product management, container and serverless security at cybersecurity technology provider Palo Alto Networks. "Based on your networking and deployment type, you would choose the most applicable driver for your environment to handle container-to-container or container-to-host communications." "The network solution must be able to meet the needs of the enterprise, scaling to potentially large numbers of containers, as well as managing ephemeral containers," Letourneau explains. The process of defining initial requirements, determining the options that meet those requirements, and then implementing the solution can be as important as choosing the right orchestration agent to provision and load-balance the containers. "In today's world, going with a Kubernetes-based orchestrator is a pretty safe decision," Letourneau says.
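As a concrete illustration of the driver choice, the sketch below uses the Docker SDK for Python (pip install docker) to create a bridge network, the driver for single-host container-to-container traffic; other drivers, such as overlay, target multi-host deployments. Network and container names are illustrative, and a running Docker daemon is assumed.

```python
import docker

client = docker.from_env()

# The bridge driver handles container-to-container traffic on one host;
# pick the driver that matches your deployment type.
client.networks.create("app-net", driver="bridge")

# Containers on the same user-defined network can reach each other by name.
client.containers.run("nginx:alpine", name="web", network="app-net", detach=True)
page = client.containers.run("alpine",
                             ["wget", "-qO-", "http://web"],
                             network="app-net", remove=True)
print(page.decode()[:80])  # first bytes of the nginx welcome page
```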
Quote for the day:
"Leadership without mutual trust is a contradiction in terms." -- Warren Bennis