Whatever backup solution you choose, copies of backups should be stored in a different location. This means more than simply putting your backup server in a virtual machine in the cloud. If the VM is just as accessible, electronically, as it would be in the data center, it is just as easy to attack. You need to configure things so that attacks on systems in your data center cannot propagate to your backup systems in the cloud. This can be done in a variety of ways, including firewall rules and the use of different operating systems and storage protocols. ... If your backup system writes backups to disk, do your best to ensure they are not accessible via a standard file-system directory. For example, the worst possible place to put your backup data is E:\backups. Ransomware products specifically target directories with names like that and will encrypt your backups. You therefore need to store those backups on disk in such a way that the operating system doesn't see them as files. Yet one of the most common backup configurations is a backup server writing its backup data to a target deduplication array that is mounted to the backup server via Server Message Block (SMB) or Network File System (NFS).
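The exposure described above can be checked for mechanically. The sketch below, in Python, flags mount points that look like backup targets reachable as ordinary directories over SMB or NFS; the name fragments and file-system-type list are illustrative assumptions, not an exhaustive rule set, and `risky_backup_targets` is a hypothetical helper rather than any vendor's tool.

```python
# Flag backup targets that ransomware could reach as ordinary files.
RISKY_NAME_FRAGMENTS = ("backup", "bkup", "bak")         # commonly targeted names (assumption)
FILE_SHARING_FSTYPES = {"cifs", "smbfs", "nfs", "nfs4"}  # SMB/NFS mount types

def risky_backup_targets(mounts):
    """mounts: iterable of (mount_point, fstype) tuples, e.g. parsed from
    /proc/mounts on Linux. Returns mount points that appear to expose
    backup data as a standard file-system directory over SMB/NFS."""
    flagged = []
    for mount_point, fstype in mounts:
        name_hit = any(frag in mount_point.lower() for frag in RISKY_NAME_FRAGMENTS)
        share_hit = fstype.lower() in FILE_SHARING_FSTYPES
        if name_hit and share_hit:
            flagged.append(mount_point)
    return flagged

if __name__ == "__main__":
    sample = [
        ("/mnt/backups", "nfs4"),      # dedupe array mounted via NFS -> exposed
        ("/var/lib/app", "ext4"),      # local application data, not a share
        ("/srv/bkup_target", "cifs"),  # SMB-mounted backup target -> exposed
    ]
    print(risky_backup_targets(sample))
```

A clean result from a check like this is necessary but not sufficient; backups reachable by any file-sharing protocol from a compromised host remain at risk.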
“The role of the CFO has further evolved beyond serving as the finance lead to becoming a ‘digital steward’ of their organization. Increasingly, CFOs are focused on collecting and interpreting data for key business decisions and enabling strategy beyond the borders of the finance function,” said Christian Campagna, Ph.D., senior managing director and global lead of the CFO & Enterprise Value practice at Accenture. “Faced with new challenges spurred by the pandemic, today’s CFOs must execute their organizations’ strategies at breakthrough speeds to create breakout value and success that can be realized across the enterprise.” The report identifies an elite group (17%) of CFOs who have transformed their roles effectively, resulting in positive changes to their organizations’ top-line growth and bottom-line profitability. ... increasingly, companies are looking to CFOs to spearhead thinking around future operating models and drive the technology agenda forward with a focus on security and ESG. In fact, 68% of surveyed CFOs say that finance takes ultimate responsibility for ESG performance within their enterprise. However, 34% specifically cited concern about data and privacy breaches as a barrier preventing them from realizing their full potential as a driver of strategic change.
Data meets science: Open access, code, datasets, and knowledge graphs for machine learning research and beyond
Reproducibility is a major principle of the scientific method. It means that a result obtained by an experiment or observational study should be achieved again with a high degree of agreement when the study is replicated with the same methodology by different researchers. According to a 2016 Nature survey, more than 70% of researchers have tried and failed to reproduce another scientist's experiments, and more than half have failed to reproduce their own experiments. This so-called reproducibility or replication crisis has not spared artificial intelligence either. Although the writing has been on the wall for a while, 2020 may have been a watershed moment. That was when Nature published a damning response written by 31 scientists to a study from Google Health that had appeared earlier in the journal. Critics argued that the Google team provided so little information about its code and how it was tested that the study amounted to nothing more than a promotion of proprietary tech. Unlike sometimes obscure academic research, AI has the public's attention and is backed and capitalized by the likes of Google. Plus, AI's machine learning subdomain, with its black-box models, makes the issue especially pertinent. Hence, this incident was widely reported on and brought reproducibility to the fore.
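One small, concrete ingredient of reproducible ML work is pinning the sources of randomness so a run can be repeated exactly. The sketch below illustrates the idea with Python's standard-library generator; it is a minimal example of the principle, not a description of any study's actual setup, and seed pinning alone does not guarantee reproducibility (data, code versions, and hardware matter too).

```python
import random

def reproducible_sample(seed, n=5):
    """Draw n pseudo-random integers from an explicitly seeded generator.
    Using a dedicated Random instance avoids relying on hidden global state."""
    rng = random.Random(seed)
    return [rng.randint(0, 99) for _ in range(n)]

# Two runs with the same seed agree exactly, so the "experiment" replays.
run_a = reproducible_sample(42)
run_b = reproducible_sample(42)
assert run_a == run_b
```

Real training pipelines extend the same idea to every stochastic component (data shuffling, weight initialization, dropout), typically alongside published code and data, which was precisely what critics said the Google Health study lacked.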
ICMCP and Women in CyberSecurity (WiCyS) announced that they will work with Target this spring to expand access to the National Cyber League (NCL) virtual competition and training program for 500 women and BIPOC individuals as a way to introduce cybersecurity and technology careers to more underrepresented students. The competition gives participants a chance to tackle simulated real-world scenarios as a way to sharpen their cybersecurity skills, explore areas of career specialization, and boost their resumes. Target CISO Rich Agostino said the opportunity for his company to participate fit with its long-standing efforts to increase diversity in its workforce and in the technical professions more broadly. For example, Agostino runs a formal mentoring program, pairing women on his team with outside executives. “I’m a huge believer that if you want to make a difference in someone’s career, you get them connected with the right people to build their network,” he says. Target, which is headquartered in Minneapolis, also works with the University of Minnesota through various programs, such as scholarships and networking opportunities, to help increase diversity among the students and, thus, the future workforce.
Firstly, machine learning processes need to be explainable. Since the vast majority of models are trained by human employees, it’s vital that users know what information they need to provide for the intended goal to be reached, so that anomaly alerts can be as accurate as possible. Samantha Humphries, senior security specialist at Exabeam, said: “In the words of Einstein: ‘If you can’t explain it simply, you don’t understand it well enough’. And it’s true – vendors are often good at explaining the benefits of machine learning tangibly – and there are many – but not the process behind it, and hence it’s often seen as a buzzword. “Machine learning can seem scary from the outset, because ‘how does it know?’ It knows because it’s been trained, and it’s been trained by humans. “Under the hood, it sounds like a complicated process. But for the most part, it’s really not. It starts with a human feeding the machine a set of specific information in order to train it. “The machine then groups information accordingly and anything outside of that grouping is flagged back to the human for review. That’s machine learning made easy.” Mark K. Smith, CEO of ContactEngine, added: “Those of us operating in an AI world need to explain ourselves – to make it clear that all of us already experience AI and its subset of machine learning every day.
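The train-then-flag process Humphries describes can be sketched in a few lines: a human supplies known-good measurements, the machine summarizes that grouping, and anything outside it is flagged back for review. This is a deliberately minimal illustration of the idea (a simple mean-and-spread model), not Exabeam's actual method; the function names and the 3-sigma threshold are assumptions for the example.

```python
from statistics import mean, stdev

def train(baseline):
    """'Training': a human feeds the machine a set of known-good
    measurements; the machine summarizes them as a mean and a spread."""
    return mean(baseline), stdev(baseline)

def flag_anomalies(model, observations, k=3.0):
    """Anything outside the learned grouping (here, more than k standard
    deviations from the mean) is flagged back to the human for review."""
    mu, sigma = model
    return [x for x in observations if abs(x - mu) > k * sigma]

# A human provides typical values; the outlier 50 falls outside the grouping.
baseline = [10, 11, 9, 10, 12, 10, 11]
model = train(baseline)
print(flag_anomalies(model, [10, 11, 50]))
```

Production systems use richer models than a mean and standard deviation, but the loop is the same: learn the normal grouping, flag the exceptions, let a human decide.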
Graph databases are a key pillar of this new order. They provide APIs, languages, and other tools that facilitate the modeling, querying, and writing of graph-based data relationships. And they have been coming into enterprise cloud architecture over the past two to three years, especially since AWS launched Neptune and Microsoft Azure launched Cosmos DB, each of which brought graph-based data analytics to its cloud customer base. Riding on the adoption of graph databases, graph neural networks (GNNs) are an emerging approach that applies statistical learning algorithms to graph-shaped data sets. Nevertheless, GNNs are not entirely new from an R&D standpoint. Research in this area has been ongoing since the early ‘90s, focused on fundamental data science applications in natural language processing and other fields with complex, recursive, branching data structures. GNNs are not to be confused with the computational graphs of tensor operations from which ML/DL models themselves are built. In a fascinating trend in which AI is helping to build AI, ML/DL tools such as neural architecture search and reinforcement learning are increasingly being used to optimize computational graphs for deployment on edge devices and other target platforms.
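The core mechanism that lets GNNs process graph-shaped data is message passing: each node updates its feature by aggregating its neighbors' features. The sketch below shows one round of mean aggregation over a toy adjacency list; the graph, features, and 50/50 combine weights are illustrative assumptions, not any library's API, and real GNNs use learned weights over many such rounds.

```python
def message_pass(adj, feats):
    """One round of mean-aggregation message passing.
    adj:   {node: [neighbor, ...]}  adjacency list
    feats: {node: float}           one scalar feature per node"""
    new_feats = {}
    for node, neighbors in adj.items():
        if neighbors:
            agg = sum(feats[n] for n in neighbors) / len(neighbors)
        else:
            agg = 0.0  # isolated node: no neighborhood signal
        # Combine the node's own feature with the neighborhood summary.
        new_feats[node] = 0.5 * feats[node] + 0.5 * agg
    return new_feats

# Node "a" is connected to "b" and "c"; its feature spreads to both.
graph = {"a": ["b", "c"], "b": ["a"], "c": ["a"]}
features = {"a": 1.0, "b": 0.0, "c": 0.0}
print(message_pass(graph, features))
```

Stacking several such rounds, with learned transformations in place of the fixed averaging here, lets information propagate across multi-hop neighborhoods, which is what makes GNNs suited to the recursive, branching data structures mentioned above.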
Quote for the day:
"Leadership is the creation of an environment in which others are able to self-actualize in the process of completing the job." -- John Mellecker