The more common, and much harder to deal with, problem is IoT devices located in places that can’t easily be reached. Last fall, for example, I wrote about the (voluntary) recall of almost half a million St. Jude Medical IoT devices due to a risk of hacking. No big deal, right? Devices get recalled for security fixes all the time. Unfortunately, in this case, the devices involved were pacemakers installed not in some easy-access equipment rack, but in patients’ chests. Swapping them out would be a very big deal (fortunately, as of publication of the post, none of the pacemakers had actually been compromised). The issue goes much further than watches and pacemakers. Smart cars have IoT systems that will become obsolete long before the vehicles in which they’re installed reach the end of their useful lives (sort of like aging 8-track players still riding around in the dashboards of cars from the 1980s). And it gets worse. For example, many industrial sensors essential to delivering the benefits of IoT are located in hard-to-reach spots where replacement or upgrades would be difficult, expensive, or hazardous.
“It is scary that for many young people, without the right words at the right moment, it is all too easy to end up going down the wrong path, not because they are bad people, but just because they are looking for opportunities to apply the skills they have. I had no interest in stealing credit cards, but I did want to hack stuff,” Lyne added. According to Lyne, there are two important issues that industry in general, and the cyber security industry in particular, needs to focus on. ... “I have spoken to many amazingly talented people coming out of competitions aimed at finding cyber security talent who are shocked that their skills can actually be used as a career. ... “Second, we need to ensure that employers are not overlooking talented people by having unrealistic recruitment criteria. We are seeing people who have proven that they have the right skills, and they are struggling to find a job because employers are insisting on things like five years’ experience or formal certifications. As a result, they are struggling to get into the industry to prove their worth and get on to the career ladder,” said Lyne.
"Innovation should become a day-to-day element of who you are," he says. "Once you get into a can-do mindset, and innovation starts to feel good, you start to develop confidence. You then realise it's OK to try things and make mistakes, and the business starts to create a culture of innovation." O'Connor says the firm has acquired many pioneering startups, where the entrepreneurs remain part of the wider company. That retained knowledge meant that when he had an idea, O'Connor could seek out mentors who had been on a similar journey beforehand. This supportive organisational culture provided guidance, structure and control through an informal mentoring network. O'Connor says that "mini Silicon Valley effect", where you have a critical mass of people who care and collaborate, is when you start making stuff happen. "An idea in isolation is a seed in concrete -- it can't take root. Idea generation is just part of the problem. You need the framework and culture to prove that thinking creatively can produce great results," he says.
What Facebook is not about is data misuse. Data misuse, along with spam, fake news, and clickbait, is something that happens on Facebook, as a recent apology ad from the company put it, but it is not what Facebook is about. What does Facebook do? It connects. What is Facebook? A community. What is Facebook for? It’s for friends. Research shows that people become closer to each other through intimate self-disclosure. But there’s only so much connecting social-media platforms can do if people are too concerned about privacy to use them for the full breadth and depth of human communication. Paradoxically, these tools that were built to bolster relationships may, by their very nature, be keeping people at a distance from each other. I recently conducted a survey to determine how much people censor themselves on social media and whether the Cambridge Analytica scandal has changed their behavior on Facebook and other platforms. I also shared my survey results with Sauvik Das, an assistant professor of interactive computing at the Georgia Institute of Technology, and Sarita Schoenebeck, the director of the Living Online Lab at the University of Michigan.
For those companies that have done absolutely nothing, I think they're probably in a worse position, if not a really bad position to be quite honest, because you never really know when a regulator might start knocking on the door. But if you've got a partial program in place, and you can show an effective program or an effective project plan for putting that program in place, I think most regulators would understand that this is an evolving thing for most organizations. As long as you're in business, you're going to have data, and you're going to come up against new challenges with that data. As your business moves, changes and modifies, you're going to have to change your data protection profile. So it's going to be an evolution. To me, I think, the date last Friday was really the beginning of the end. Or the end of the beginning, sorry, in regards to putting an effective program in place for data management and data protection.
A core component of retaining your employees, regardless of their position, is learning what matters most to them. When it comes to tech professionals, remote work options and autonomy are the most in-demand perks. First, prioritize remote work options. If entirely remote positions don’t mesh well with your collaborative company culture, consider allowing tech-focused employees to work remotely for three to four days per week. With the emergence of group chat apps like Slack, having employees work remotely doesn't have to mean decreased communication or collaboration. In fact, it might actually lead to increased collaboration, productivity and innovation. Second, give your employees the autonomy they need to get the job done. The reason that many tech professionals are paid so well is that they have niche skills that most other people don't. Trust them and the knowledge they have to get the job done well. There's rarely, if ever, a need to micromanage high-end software development talent. If developers or software engineers feel like they are constantly being asked to do things in a certain way that doesn't make sense to them, they won't stick around long enough for you to solve the management problem.
Email is the number one threat vector, according to Barracuda researchers, precisely because it allows malicious third parties to directly target employees within an organisation, underlining the importance of user education around email-related cyber threats. Despite the availability of tools and technologies such as email encryption, data loss prevention, social engineering detection, phishing simulation and artificial intelligence that can help mitigate these threats, the survey revealed that the vast majority of respondents believed user training and awareness programmes were a vital prerequisite to improving email security. Survey respondents recognised the insider threat, with more citing poor employee behaviour (79%) than inadequate tools (21%) as the greater email security concern. There was most concern over individual staff members falling victim (47%), although executives (37%) were also viewed as a potentially dangerous weak link in the security chain. Finance (26%) and sales (18%) departments were viewed with the most caution. Topping respondents' concerns was the fact that these roles and departments have access to sensitive information and systems and were the most likely to be targeted.
Google still lags behind AWS and Microsoft Azure in public cloud capabilities, but it has added services and support in recent months to shake its image as a cloud valued solely for its engineering. Google must expand its enterprise customer base, especially with large organizations in which multiple stakeholders sign off on use of a particular cloud, said Fernando Montenegro, a 451 Research analyst. Not all companies will pay the premium for this functionality, but it could be critical to those with compliance concerns, including those that must prove they're on dedicated hardware in a specific location. For example, a DevOps team may want to build a CI/CD pipeline that releases into production, but a risk-averse security team might have some trepidation. With sole tenancy, that DevOps team has the flexibility to spin up and down, while the security team can sign off on it because it meets some internal or external requirement. "I can see security people being happy that we can meet our DevOps team halfway, so they can have their DevOps cake and we can have our security compliance cake, too," Montenegro said.
Merge replication is a common technique employed by relational databases. This technique allows you to deploy a distributed database solution in which each database server has its own copy of data. An external agent then collects changes to the local copies and merges them in an effort to force all of the database servers to contain the same copy of data. The topology of typical merge replication has database servers in multiple regions and follows the publisher/subscriber model. One of the servers is identified as a primary server or a publisher, while the rest of the servers are subscribers. In a normal flow, all changes to the publisher trickle down to the subscribers. However, in merge replication, subscribers can make database changes too, and merge all their local changes with the publisher. These changes will eventually go to all subscriber servers. Assuming no conflict occurs during the merge, all changes made to either the publisher or the subscribers will eventually converge as the same copy. The “Merge Agent” is an external service responsible for gathering all the changes to the local database servers and merging them into a single data set. When a conflict occurs, the merge agent follows a predefined set of rules to resolve the conflict.
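The publisher/subscriber flow described above can be sketched in a few lines. This is a hypothetical, minimal model (the `Node` class, `merge_agent` function, and last-writer-wins timestamp rule are illustrative choices of mine, not any particular database's API), showing how a merge agent gathers pending local changes, resolves a conflict by a predefined rule, and converges every server on the same copy:

```python
import itertools

_clock = itertools.count(1)  # stand-in for a global change timestamp


class Node:
    """A database server holding its own local copy of the data."""

    def __init__(self, name):
        self.name = name
        self.data = {}     # key -> (value, timestamp)
        self.pending = {}  # local changes not yet merged

    def write(self, key, value):
        change = (value, next(_clock))
        self.data[key] = change
        self.pending[key] = change


def merge_agent(publisher, subscribers):
    """Gather pending changes from every server and converge them.

    A conflict (the same key changed on two servers) is resolved by a
    predefined rule -- here, last writer wins by timestamp.
    """
    merged = {}
    for node in [publisher, *subscribers]:
        for key, (value, ts) in node.pending.items():
            if key not in merged or ts > merged[key][1]:
                merged[key] = (value, ts)
        node.pending.clear()
    # Push the winning version of every changed key to all servers.
    for node in [publisher, *subscribers]:
        node.data.update(merged)


pub = Node("publisher")
sub1, sub2 = Node("sub1"), Node("sub2")

pub.write("price", 100)   # change at the publisher
sub1.write("price", 120)  # conflicting change at a subscriber
sub2.write("stock", 7)    # independent change at another subscriber

merge_agent(pub, [sub1, sub2])

# All three servers now hold an identical copy; "price" went to the
# last writer (sub1), and sub2's "stock" change reached everyone.
assert pub.data == sub1.data == sub2.data
```

Production implementations (SQL Server's merge replication, for instance) track changes with triggers and metadata tables and offer much richer conflict resolvers, but the convergence property is the same: once the merge agent has run, publisher and subscribers hold the same data set.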
This industry likes to abandon technologies as soon as it adopts them, but a few find a way to hang around. I recently purchased a car, and in the finance office was a dot matrix printer, chugging away at the same multipage forms I saw used more than 25 years ago. Tape backup is also hanging in there. With data being produced in ever-increasing volumes, it has to be stored somewhere, and hard drives aren’t enough. For true mass backup, enterprises are still turning to tape, and the LTO Program Technology Provider Companies (TPCs) say 2017 shipments grew 12.9 percent over 2016 to 108,457 petabytes (PB) of tape capacity. The LTO TPCs are a group of three tape backup providers: HPE, IBM, and Quantum. There are other tape backup providers, such as Oracle, which inherited the StorageTek business from Sun Microsystems and still sells those systems, but it was not included in the count. Actual unit shipments dropped slightly in 2017, which the organization attributes to customers waiting for new LTO-8-based units to ship. LTO-8 technology offers a compressed transfer rate of 750MB/sec, an improvement over LTO-7's 400MB/sec, and capacity is increasing to 30TB compressed per cartridge, up from 22TB compressed in LTO-7.
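Those LTO-8 figures are easy to sanity-check. A quick back-of-the-envelope calculation (assuming the quoted compressed rate could be sustained continuously, which real workloads rarely manage) shows how long one cartridge would take to fill:

```python
# Back-of-the-envelope check of the LTO-8 figures quoted above.
capacity_tb = 30   # compressed capacity per LTO-8 cartridge
rate_mb_s = 750    # compressed transfer rate, MB/sec

seconds_to_fill = capacity_tb * 1_000_000 / rate_mb_s  # 1 TB = 1,000,000 MB
hours_to_fill = seconds_to_fill / 3600

print(round(hours_to_fill, 1))  # roughly 11.1 hours per cartridge
```

In other words, even at full speed a single drive needs about half a day per cartridge, which is why large tape installations stripe backups across many drives in parallel.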
Quote for the day:
"You can't lead anyone else further than you have gone yourself." -- Gene Mauch