Two-factor authentication, or 2FA, has been sold to web users as one of the most important and trustworthy tools for securing your digital life. You probably know how it works: by supplying an account with not just your password but also a secondary piece of information (typically an automated code sent to your phone or device of choice), companies can verify that whoever signs into your account is really you and not just some goon who has managed to get hold of your personal information. According to new research, however, those goons have unfortunately found a number of effective ways around 2FA protections, and they are using these methods more and more. The study, from academic researchers at Stony Brook University and cybersecurity firm Palo Alto Networks, documents the recent discovery of phishing toolkits (malicious software programs designed to aid in cyberattacks) that are being used to sneak past authentication protections.
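To see why a one-time code can still be phished, it helps to look at how such codes are generated. One common scheme (used by authenticator apps rather than SMS) is the time-based one-time password of RFC 6238: just an HMAC over a 30-second time counter, so a code is valid for a short window and a real-time phishing proxy can relay a freshly captured code before it expires. A minimal sketch, using the shared secret from the RFC 4226 test vectors for illustration:

```python
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    # HMAC-SHA1 over the big-endian counter (RFC 4226)
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, for_time=None, step: int = 30) -> str:
    # Time-based OTP: the counter is the number of 30-second steps
    # since the Unix epoch, so the code rotates every `step` seconds
    t = time.time() if for_time is None else for_time
    return hotp(secret, int(t // step))
```

The short validity window limits replay of an old code, but it does nothing against a proxy that forwards the code to the real site within the same window, which is exactly the gap the phishing toolkits exploit.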
Digitisation has been a hot topic in the last few years, and we are now seeing rapid adoption of HR technology right across the "hire to retire" cycle. Automation in HR has become a reality that touches nearly every aspect of human resource management. Looking ahead, the expectation is that all HR and employee services will be available in the palm of an employee's hand. Digitisation will help overcome HR challenges and improve business efficiency and workplace productivity. But while most processes will move to digital, the humane elements of HR cannot be replaced. Teams will become more complex, and employees will therefore need more human attention and empathy. AI will continue to help organisations in many smart ways, but when it comes to physical and mental well-being concerns, HR will have to keep tackling employee issues empathetically. Retaining talent has always been a key priority, but the focus now is on keeping employees happy and willing to stick around. The Covid outbreak has forced the workforce to reprioritise their careers and choose employers and jobs that fit their new priorities.
Automation has been with us for centuries, but its application in the digital age is something entirely new. We're seeing it creep into almost every aspect of our lives, with far-reaching and never-before-seen results. Take, for instance, a sporting event such as the 2021 Formula 1 Heineken Dutch Grand Prix, where fans could order tickets, procure refreshments, complete purchases securely and get all their questions answered through a chatbot running on WhatsApp. With the help of data reporting and features such as suggested answers, internal collaboration and skill-based routing, the Dutch Grand Prix could interact with fans efficiently and effectively. This was made possible by deploying a customer data platform (CDP) to manage event pre-registration data, and using marketing cloud software to handle attendee reductions due to the pandemic. The technology helped create a seamless experience for both event organisers and fans. While this kind of automation has so far focused on improving customer experience, it has the potential to impact employee experience (EX) as well.
At its core, data mesh is the recognition that how we model, produce and consume data can be decoupled. With decoupled data, the common concern is: if users need data or services they didn't create, how will they find them and learn to use them? This is the part of data mesh that affects data discovery the most. Data mesh splits centralized data into data domains and lets users apply high-quality data-product thinking to how data is shared. Data discovery is essentially the capability that enables the data and control planes on the mesh, creating a better environment for discovering and tagging data. Companies that already have a data mesh model first need a data discovery platform to discover and understand their data, which is where discovery starts with data mesh. Then, as data teams begin owning their data by adding tags and ownership, data mesh allows these teams to invite other users through democratized access to data while maintaining full governance and control over a source of truth with distributed ownership. This is the main intersection of discovery and its role in data mesh.
It doesn't matter how skilled you are at coding or building models: if you can't understand the business you're in, you won't succeed as a data scientist. No one will value your work if you don't help the company reach its goals, no matter how well your model performs. Do some research on the company you work for and the industry it's in, and take some time to think about how a data scientist like you can help it reach its goals. Asking questions of your boss and colleagues helps a lot too; they can help you better understand your role in the company. Every year new technologies rise, tools become obsolete and code becomes deprecated. This is why you should always keep an eye on what's new in the field by reading blogs, research papers, and books. Staying up to date will help you take advantage of the latest advancements in data science. As a result, you will be one step ahead of the rest and, in most cases, get the best performance possible in a project. Being open to learning new tools and adapting to change is a habit that will help you throughout your career (which leads us to our next habit).
Clearly, DevSecOps requires a high degree of trust between the developer and security sides of the organization, according to Dooley. In part, that's because DevSecOps is ultimately best delivered by automating security as much as possible during app development. To get to a true DevSecOps program, security teams must start by providing developers with data presented in the form in which they operate, which for many DevOps teams is a Jira ticket, Dooley said. “Show up in the packaging and format that they’re used to, and supply them with all the information that they need to do to just treat [security issues] like a bug or like a feature,” he said. Thus, the first level on the journey to DevSecOps can involve supplying developers with a secure code sample that fixes a certain issue in the code, Dooley said. But this secure code still needs to be implemented manually. At the next level, companies can enable semi-automated remediation, he said. This can involve automatically disabling issues that are creating a security exposure. With this approach, a human still has to sign off on the final build.
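The "show up in their format" idea can be sketched as a small transform from a scanner finding into a ticket-shaped payload a developer can triage like any other bug. The finding fields and ticket shape below are hypothetical, not any particular scanner's output or Jira's actual API:

```python
def finding_to_ticket(finding: dict) -> dict:
    # Map a (hypothetical) security-scanner finding onto the fields a
    # Jira-style tracker expects, so it lands in the developers' normal queue
    return {
        "project": finding.get("repo", "UNKNOWN"),
        "issuetype": "Bug",
        "summary": f"[{finding['severity'].upper()}] {finding['rule']} in {finding['file']}",
        "description": finding.get("remediation", "See scanner output for details."),
        "labels": ["security", finding["rule"]],
    }

# Example: a high-severity finding becomes an ordinary-looking bug ticket
ticket = finding_to_ticket({
    "repo": "payments-api",
    "severity": "high",
    "rule": "sql-injection",
    "file": "db/query.py",
    "remediation": "Use parameterized queries instead of string formatting.",
})
```

The key design choice is that the remediation guidance travels inside the ticket itself, so the developer never has to context-switch into a separate security tool.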
The 2009 Oscar-winning sports drama starring Sandra Bullock was released at a time when racial bias and discrimination were quite prevalent in the world. This heart-rending story is about a seventeen-year-old African American foster child who is taken into care by a wealthy Southern family whose matriarch finds something precious inside him that no one else could: a striving will for excellence. What HR professionals can learn from the movie: this film is a classic example of the 'nature vs nurture' debate. Sandra Bullock's Oscar-winning character practices what she preaches, i.e. equality, care and sensitivity beyond societal prejudice and bias, and uses them to unleash a person's potential. In the past, HR leaders may have tended to overlook the smallest details of human personality, but in the new world of work these nuances have become integral to managing and motivating your workforce. It is important to understand that every contributor brings something to the table and, just like Leigh Anne Tuohy in the movie, HR leaders must recognise the differences in their employees' experiences and nurture them to their strengths.
Everything from customer lists to intellectual property is in danger of being exfiltrated, and for a variety of reasons. It may be that employees feel entitled to the intellectual property they helped create. It may be that they want to jump-start their work with a new company and that taking data, in their minds, is no crime. It may be that they're disgruntled and looking to do harm by selling the data to a competitor. In the most extreme case, the employees in question may even be cybercriminals-for-hire who infiltrate enterprises with the sole purpose of causing harm. Amid all these possibilities, it is no longer reasonable for organizations to risk leaving the protection of their most valuable data up to chance. Organizations with valuable intangible assets must consider performing digital forensic scans as part of their standard operating procedures when offboarding employees, to identify threat actors and the data they're exfiltrating. Doing so would enable organizations to better manage the risk of insider activity and give them the means to recover critical data and, in some instances, even seek legal recourse.
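One simple form such an offboarding scan can take is hashing every file in a captured image (say, of a departing employee's drive or USB stick) and comparing the hashes against a manifest of known-sensitive documents. This is a minimal sketch with throwaway files standing in for the image; real forensic tooling also handles deleted files, archives, and partial matches:

```python
import hashlib
import tempfile
from pathlib import Path

def sha256_of(path: Path) -> str:
    # Stream the file in chunks so large artifacts aren't loaded into memory
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def flag_sensitive_copies(root: Path, sensitive_hashes: set) -> list:
    # Walk the scanned directory and flag any file whose hash matches
    # a document on the known-sensitive manifest
    return sorted(
        p for p in root.rglob("*")
        if p.is_file() and sha256_of(p) in sensitive_hashes
    )

# Demo: a temp directory stands in for the offboarding image
with tempfile.TemporaryDirectory() as d:
    root = Path(d)
    (root / "customers.csv").write_bytes(b"acme,contact@acme.example\n")
    (root / "notes.txt").write_bytes(b"lunch ideas\n")
    manifest = {sha256_of(root / "customers.csv")}
    hits = flag_sensitive_copies(root, manifest)
```

Exact-hash matching only catches verbatim copies; that is why it works best as a routine baseline check within a broader forensic procedure rather than the whole of it.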
External threats are not the only problem. The disruption of the past 18 months has seen many organisations realise that their strategic security posture was behind the curve when the request for remote work became a requirement. For example, research conducted during the summer of 2020 found that the biggest threat noted by security departments was the inability to implement multi-factor authentication to facilitate secure remote worker access, something that should have been in place already. However, with resources stretched to breaking point by the immediate demands of the almost exclusively remote workforce, many organisations had to leave strategic security projects on the back burner. Now, as they urgently try to catch up, security teams are also facing an industry-wide shortage of talent. The latest data from the 2021 (ISC)2 Cybersecurity Workforce Study estimates that an additional 700,000 professionals have joined the cyber security sector, but that the gap between the number of additional professionals needed to adequately defend organisations and the number currently available stands at 2.7 million.
Due to the ever-evolving landscape of open source libraries, vendors, and approaches to building ML models as well as the shortage of qualified ML engineers, there is a significant lack of industry best practices for creating deployable, maintainable, and governable ML models. When developing ML models with governance in mind, the most important considerations are reperformance, version control, interpretability, and ease of deployment and maintainability. ... Reperformance is the ability to reperform or reproduce a transaction or a model training and obtain identical results. Much has been said about the “reproducibility crisis” in science, and the AI/ML community is not immune from this criticism. Creating machine learning systems that are reproducible is definitely possible, and putting in the effort up front to do so ultimately yields more robust, dependable deployments, fewer headaches for developers, and fewer questions from auditors and other reviewers.
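The reperformance point can be illustrated with a toy training loop: if every source of randomness is seeded up front, two independent runs produce identical parameters. This is a minimal stdlib sketch; real pipelines additionally need pinned library versions, data snapshots, and control over hardware nondeterminism:

```python
import random

def train_toy_model(seed: int, steps: int = 100) -> list:
    # Route ALL randomness through one seeded generator so a re-run
    # ("reperformance") of the training is bit-identical
    rng = random.Random(seed)
    weights = [rng.gauss(0.0, 0.1) for _ in range(4)]
    for _ in range(steps):
        i = rng.randrange(len(weights))         # pick a weight to perturb
        weights[i] -= 0.01 * rng.gauss(0.0, 1.0)  # noisy update step
    return weights

# Two independent runs with the same seed reproduce exactly;
# a different seed yields a different model
run_a = train_toy_model(seed=42)
run_b = train_toy_model(seed=42)
```

Recording the seed alongside the model version is what lets an auditor reperform a training run later and obtain the identical artifact.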