A typical coding interview involves presenting a candidate with a technical problem, which they must solve in real time in front of the interviewing panel. Formats vary from one company to another, but a common one is whiteboard coding, in which a candidate might be asked to solve a problem involving a binary tree. It was a binary tree task that drew the ire of Howell in his now-famous tweet. These exercises are a fairly typical part of technical interviews, designed to assess a candidate's ability to solve a programming problem while showing their thinking 'out loud'. Still, many programmers argue that this isn't representative of anything they'd have to do in their day-to-day job, and that it's an outdated means of assessing candidates that doesn't reflect their skill level. "These little challenges don't show the greater skill sets, which for me are the ability to construct large programs," says Howell. "It's not about small algorithms. It's about the design of larger systems, and that's way more important." Howell also sees traditional coding interviews as reflective of an industry that focuses too much on building at speed. "It's partly because the software industry moves so fast," he says.
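To give a flavour of the kind of whiteboard task the article describes, here is a minimal sketch of one classic binary tree exercise, mirroring (inverting) a tree. This is an illustrative example of the genre, not the specific problem Howell was asked; the `Node` class and `invert` function are our own names.

```python
# A hypothetical whiteboard-style task: invert (mirror) a binary tree.
class Node:
    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right

def invert(node):
    """Recursively swap the left and right children of every node."""
    if node is None:
        return None
    node.left, node.right = invert(node.right), invert(node.left)
    return node

# Usage: invert a tiny tree and read off the swapped children.
root = invert(Node(1, Node(2), Node(3)))
print(root.left.value, root.right.value)  # 3 2
```

The whole solution fits in a dozen lines, which is exactly Howell's point: it exercises a small algorithm, not the design of larger systems.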
Factories where engineers or scientists use smart glasses to obtain virtual guidance, operators who work with remote vendors to detect equipment failures in real time, and interactive training sessions run by directors on another continent are already here. “The barriers to adoption are decreasing as the AR industry becomes more robust,” notes Stracquatanio. Perhaps the biggest advantage of AR is that it lets people see the production process virtually, without needing to be on site. “It’s a game-changer for the industry. Individuals can have eyes and ears on site at a moment’s notice to address an emerging issue, or to host routine remote collaboration sessions,” Stracquatanio highlights. AR can also increase control over the manufacturing process. Pharma and biotech companies cannot afford mistakes during the production phase. A small oversight can lead to serious consequences, such as having to start from scratch, which is expensive and time-consuming. A recent example is Johnson & Johnson’s manufacturing partner Emergent BioSolutions, whose workers erroneously mixed ingredients from two different Covid-19 vaccines, wasting around 15 million vaccine doses.
Cybercriminals are increasingly leveraging fileless malware, cryptominers and encrypted attacks, targeting both remote users and corporate assets behind the traditional network perimeter. These were among the findings of WatchGuard Technologies’ Internet Security Report for Q4 2020, which found that fileless malware and cryptominer attack rates grew by nearly 900% and 25%, respectively, while unique ransomware payloads plummeted by 48% in 2020 compared with 2019. The report also found that botnet malware targeting IoT devices and routers became a top strain. Among these was the Linux.Generic virus (also known as “The Moon”), malware that is part of a network of servers directly targeting IoT devices and consumer-grade network devices, such as routers, to exploit any open vulnerabilities. Total network attack detections grew by 5% in Q4, reaching their highest level in more than two years, while total unique network attack signatures also showed steady growth, up 4% compared with the third quarter of 2020. “We believe the increase in endpoint attacks between 2019 and 2020 is largely due to the widespread rise of remote work in response to the global pandemic,” explained Corey Nachreiner, WatchGuard CTO.
Passwords are still the most common form of user authentication, “protecting” accounts, devices and systems, but on their own they don’t provide strong security. Nor do they offer the best user experience. Many passwords don’t even meet the minimum criteria of being unique and complex. People reuse passwords across accounts because they simply can’t keep track of all the logins they have. To ease the burden, they choose passwords that are easy to remember, which also makes them easy to guess. In fact, our research shows that people reuse their passwords across an average of ten personal accounts, while ‘123456’ still topped the list of the most common passwords in 2020. Even when users choose well, a unique and complex password can still fall victim to a modern phishing attack; after all, even an exemplary password can’t protect an account if the holder has been tricked into handing it over. From a user experience perspective, there is the stress and strain of choosing, each time, a unique and complex password that also meets the criteria demanded by the platform or service provider.
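The uniqueness and complexity criteria described above can be sketched as a simple checker. This is an illustrative sketch only, with a made-up three-entry deny-list and arbitrary thresholds, not any vendor's actual policy engine.

```python
import re

COMMON = {"123456", "password", "qwerty"}  # tiny sample deny-list for illustration

def weak_reasons(pw, previously_used):
    """Return the ways a password fails basic uniqueness/complexity checks."""
    reasons = []
    if pw in COMMON:
        reasons.append("on common-password list")
    if pw in previously_used:
        reasons.append("reused from another account")
    if len(pw) < 12:
        reasons.append("shorter than 12 characters")
    if not re.search(r"[A-Za-z]", pw) or not re.search(r"\d", pw):
        reasons.append("lacks mixed letters and digits")
    return reasons

# '123456', the most common password of 2020, fails three checks at once.
print(weak_reasons("123456", {"hunter2"}))
```

Note that a checker like this addresses only the guessability problem; as the article points out, it cannot defend a well-chosen password against phishing.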
“Cyber crime economies are shaping the character of nation-state conflicts,” said McGuire. “There is also a ‘second generation’ of cyber weaponry in development that draws upon enhanced capabilities in computing power, AI [artificial intelligence] and cyber/physical integrations. One such example is ‘Boomerang’ malware, which is ‘captured’ malware that can be turned inward to operate against its owners. “Nation states are also developing weaponised chatbots to deliver more persuasive phishing messages, react to new events and send messages via social media sites. In the future, we can also expect to see the use of deepfakes on the digital battlefield, drone swarms capable of disrupting communications or engaging in surveillance, and quantum computing devices with the ability to break almost any encrypted system.” To ease rising tensions and prevent nation states from being drawn into more hostile cyber attacks, 70% of the expert panel said they thought some kind of international treaty would ultimately be necessary, though this is by no means a new idea. Just 15% of them thought a cyber convention would be agreed this decade, 37% said it was more likely to come in the 2030s, and 30% said it would probably never happen.
Trapped-ion qubits were used to implement the first quantum logic gates in 1995, and the proposal for a quantum charge-coupled device (QCCD) – a type of quantum computer whose operations are controlled by shuffling the ions around – was first made in 2002 by researchers led by David Wineland of the US National Institute of Standards and Technology, who went on to win the 2012 Nobel Prize for Physics for his work. Quantum gates have subsequently been demonstrated in multiple platforms, from Rydberg atoms to defects in diamond. The quantum computing technology first adopted by the IT giants, however, was the solid-state qubit. Here, the qubits are superconducting circuits, which can be mounted directly onto a chip. These rapidly surpassed the benchmarks set by trapped ions, and are used in record-breaking machines from IBM and Google: “Working with trapped ions, I would be asked by people, ‘Why aren’t you working with superconducting qubits? Isn’t that race pretty much already settled?’,” says Winfried Hensinger of the UK’s University of Sussex. Recently, however, progress with superconducting circuits appears to be slowing as quantum computers integrate more and more qubits.
MPC, or multi-party computation, addresses confidentiality by utilizing a network of computation nodes that compute directly on encrypted data while maintaining zero knowledge about that data. For example, an employer may want to find out the average age of their employees. For privacy reasons, those employees may not be willing to share their ages, but through secret sharing they can contribute their ages without any individual age being identifiable. The possibilities this technology enables are endless; one need only think of the benefits it could bring to industries such as banking and insurance. While MPC solves for privacy, blockchain itself can protect individual data against data breaches via the decentralization of sensitive information. Alone, however, blockchain lacks the infrastructure required to ensure data remains private. ... Not only is the pairing of MPC technology and blockchain a better solution for safeguarding consumer data than those currently in existence, it is one of the most viable solutions that effectively deals with the monumental problem of data security.
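The average-age example above can be sketched with additive secret sharing, one of the simplest MPC building blocks: each employee splits their age into random shares that sum to it modulo a public prime, each node only ever sees one share per person, yet the sum (and hence the average) is recoverable. This is a toy illustration of the principle, not a production MPC protocol; the prime and the three ages are arbitrary.

```python
import random

PRIME = 2**31 - 1  # all arithmetic is done modulo a public prime

def share(secret, n_parties):
    """Split a secret into n additive shares that sum to it mod PRIME."""
    shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % PRIME)
    return shares

ages = [29, 41, 35]
n = len(ages)
# Each employee splits their age into one share per computation node.
all_shares = [share(a, n) for a in ages]
# Node j sums the j-th share from every employee; no node sees a raw age.
node_sums = [sum(col) % PRIME for col in zip(*all_shares)]
# Combining the nodes' partial sums reveals only the total, never an individual age.
total = sum(node_sums) % PRIME
print(total / n)  # 35.0 -- the average, recovered without exposing anyone's age
```

Each individual share is uniformly random, so a single node learns nothing about any employee's age; only the combination of all partial sums reveals the total.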
GPipe is a distributed machine learning library that uses synchronous stochastic gradient descent together with pipeline parallelism to train any DNN consisting of multiple sequential layers. GPipe partitions a model across multiple accelerators and splits mini-batches of training examples into even smaller micro-batches. As a result, GPipe’s accelerators can operate in parallel, maximising the scalability of the training process. It allows easy deployment of additional accelerators to train larger models and to scale performance further without tuning hyperparameters.
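The partition-and-micro-batch idea can be sketched as follows. This is a toy sequential simulation of the scheduling concept, not the GPipe API: the two "stages" stand in for model partitions on separate accelerators, and on real hardware the stages would process different micro-batches concurrently rather than in a plain loop.

```python
# Illustrative sketch (not the GPipe API): a mini-batch is split into
# micro-batches that flow through model partitions stage by stage.
def split_into_microbatches(batch, n_micro):
    """Divide a mini-batch into n_micro equal-sized micro-batches."""
    size = len(batch) // n_micro
    return [batch[i * size:(i + 1) * size] for i in range(n_micro)]

def run_pipeline(stages, batch, n_micro):
    """Run each micro-batch through every stage in order, then re-join."""
    outputs = []
    for micro in split_into_microbatches(batch, n_micro):
        for stage in stages:  # on real hardware these overlap across micro-batches
            micro = [stage(x) for x in micro]
        outputs.extend(micro)
    return outputs

# Two "partitions" of a toy model, each imagined on its own accelerator.
stages = [lambda x: x * 2, lambda x: x + 1]
print(run_pipeline(stages, list(range(8)), n_micro=4))
# [1, 3, 5, 7, 9, 11, 13, 15]
```

Because each micro-batch occupies only one stage at a time, stage 1 can start on micro-batch 2 while stage 2 works on micro-batch 1, which is what keeps all accelerators busy.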
Both private and professional lives are slowly readopting former practices, such as eating inside a restaurant. As we cautiously return to normal, road warriors are ready to get back on the road, but we're also keen to keep some of the improved healthcare, restaurant and retail experiences we've discovered over the last year. Respondents cited the top four things they missed while working remotely: spontaneous interactions with colleagues they wouldn't have talked to otherwise; simply being around other people; exposure to a diversity of perspectives and ideas; and productivity. Qualtrics found that 51% of respondents reported improved productivity during the pandemic lockdown, and respondents were twice as likely to report improved well-being as to say it had declined. Managers concur: 55% said their direct reports have been more productive. By generation, 54% of millennials said they're more productive, along with 53% of Gen Z, 48% of Gen X and 34% of boomers. Productivity improved thanks to flexible schedules (31%), no commute (26%), more control over workspace (24%), the ability to focus with fewer work interruptions (24%) and more privacy and personal space (23%).
All of this saves time and helps them be more effective at mitigating threats and reducing risks. CTI allows the SOC to see beyond the perimeter, so analysts are aware of threats before they hit the infrastructure. That gives the SOC time to prepare and tweak defenses, such as deploying specific monitoring rules or knowing what to be on the lookout for. When dealing with incidents or alerts, this additional context lets them place an individual alert, or a set of alerts, in the wider context of who is behind it, what their aims are, what typical next steps would be, or even what must have gone before for this to occur. All of that makes it easier to determine how to respond. And when dealing with multiple alerts or incidents, as SOCs do, this context allows you to prioritize, separating the wheat from the chaff as it were. That's critical, as many SOCs are resource-strained, and knowing which items to focus on helps make the most effective use of limited resources.
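The enrich-then-prioritize workflow described above can be sketched in a few lines. Everything here is hypothetical for illustration: the indicator values, actor names and severity scale are invented, and a real SOC would pull this context from a threat-intel feed rather than a hard-coded dictionary.

```python
# Hypothetical illustration: enrich SOC alerts with threat-intel context
# and sort them so alerts tied to known campaigns rise to the top.
threat_intel = {  # indicator -> known actor (invented values for illustration)
    "45.13.2.9": "FIN-group",
    "bad-domain.example": "Botnet-X",
}

alerts = [
    {"id": 1, "indicator": "10.0.0.5", "severity": 2},
    {"id": 2, "indicator": "45.13.2.9", "severity": 2},
    {"id": 3, "indicator": "bad-domain.example", "severity": 1},
]

def priority(alert):
    # A match against known threat intel outweighs raw severity when triaging.
    return (alert["indicator"] in threat_intel, alert["severity"])

for a in sorted(alerts, key=priority, reverse=True):
    actor = threat_intel.get(a["indicator"], "unknown")
    print(a["id"], actor)
```

Even in this toy version, the lower-severity alert 3 outranks the intel-free alert 1, which is the "wheat from the chaff" effect the article describes: context, not raw severity, drives the triage order.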
Quote for the day:
"It's good to trust others but, not to do so, is much better." -- Benito Mussolini