Daily Tech Digest - March 24, 2019

Service Brokering & Enterprise Standard - Build Your Competitive Advantage In The Digital World

Implementing service brokering within an organization requires a fundamental change in culture, as the focus must evolve from function and technology to service and service delivery. Rather than silos organized around technologies, the organization should rally around teamwork to deliver each service optimally, with the broker central to the integration between provider and consumer. This is the most difficult aspect of implementing brokering: changing the way people work and evolving their behaviors to be more user-focused takes time. Unfortunately, IT departments have no choice: either they deliver the services users require through the supply chain they have developed, or they will be left managing legacy environments, which may not be seen as a very exciting job. Multiple service use cases are documented in the guide. The roles and responsibilities of the players differ for each, but efficient service delivery can only be assured if the providers work together smoothly and transparently.


Technical Debt and Scrum: Who Is Responsible?

The issue is that technical debt is not created only by the typical hack: the technical shortcut that is beneficial today but expensive tomorrow (a not uncommon tactic in feature factories). There is also a kind of technical debt that is created passively as the Scrum Team learns more about the problem it is trying to solve. Today, the Development Team might prefer a different solution to the one it implemented just six months ago. Or perhaps the Development Team upgrades the definition of “Done,” thus introducing rework in former product Increments. No matter what angle you look at the problem from, you cannot escape it, and Scrum does not offer a silver bullet either. ... The Scrum Guide is deliberately vague on the question of who is responsible for technical debt in order to foster collaboration and self-organization, starting with the Scrum values (courage and openness come to mind) and leading straight to transparency and Scrum’s inherent system of checks and balances.


Cybersecurity is an attractive career for ambitious people and a great way to make the world a better place. If you want a career in cybersecurity, don’t wait. You don’t need to be of a particular age or gender. You don’t need any particular approval, certification or place of study to get going. Just start learning and start doing. Get involved any way you can. Bug bounties are a great way to learn and test your skills. Check out Hacker101. Just know that even if you can jump straight in, you will need skill, tenacity and patience to ultimately reach a rewarding level of proficiency. Bug hunters may need a year or two of learning before they start finding security vulnerabilities worth reporting. Most bug hunters study the Hacktivity feed, where vulnerability reports are published once the vulnerability has been fixed. Also note that to go far and become a technical expert in cybersecurity, a lot of studying will be needed. What you invest in learning will come back as career opportunity. A degree in Computer Science will not hurt.



Three Steps to Regain Control over your IT Landscape

Most IT landscapes of larger companies consist of hundreds of applications that are interconnected via poorly designed interfaces. In most companies, these IT landscapes already carry an enormous technical debt (i.e., an ‘unnecessary complexity’). In my experience, a company typically runs between 80% and 90% more IT applications (and therefore also servers, databases, networks, costs) than would be needed if it had implemented the ideal architecture. That is a tremendous waste of money and resources, and the reason why IT is perceived as slow and as a cost factor rather than an enabler. From my point of view, there are three major reasons for this disastrous situation ... There is a tendency to blame the IT department for this situation, but that blame is misplaced: it’s a business problem. Requirements are typically not consolidated well across departments. IT has always just been the contractor who had to implement those one-off requirements under time pressure.


Like Football, Your Cybersecurity Defense Needs a Strong Offense

Today, it’s essential to not only build the strongest possible defenses but also to deploy creative strategies to gain information on your attackers and how they are trying to breach your networks and penetrate your systems. This idea that “the best defense is a good offense” is not just a slogan representing the conventional wisdom of the cybersecurity intelligentsia. ... In “The Future of Cybersecurity: The Best Defense Is a Good Offense,” the company speaks directly to all organizations when it waves the following red flag: With the sophisticated techniques threat actors are using to mask their activities, the traditional approach of ‘building bigger fences’ will no longer suffice. The only way organizations can protect themselves is by unleashing offensive cyber techniques to uncover advanced adversaries on their networks. As an example of what going on the offensive might look like, one strategy the company uses is to configure fake computers in a phony, intentionally vulnerable network that functions as “a virtual mousetrap” to lure cyber adversaries; when the hackers bust in, they reveal valuable information about their identities, tactics and intentions.


Cybersecurity: Don’t let the small stuff cause you big problems

Organisations of all sizes in all sectors need to have a cybersecurity strategy, but for healthcare, it's particularly important. Not only do IT networks within hospitals and doctors' surgeries need to be accessible and secure in order to provide patient care, but these networks also hold medical information – some of the most sensitive data that can be held about people. "What's really important is having control over the data and knowing where it is. It's the same issue that's dealt with in many other industries, but with an extra level of duty of care to the people whose data you've got," said Sian John, chief security advisor for EMEA at Microsoft. "You're talking about privacy: it's one level when you're talking about financial data, it's another level if that's my medical history," she added. What's important for health organisations as a whole is being absolutely sure how data is controlled and how it is accessed – and making that knowledge a priority.


Some Cybersecurity Vendors Are Resorting To Lies & Blackmail


It’s hard for cybersecurity companies to get noticed. Smaller vendors particularly struggle because top corporations already have contracts or strong customer relationships with the biggest companies. This is where the threat of negative media coverage comes in. Exposing a security flaw, no matter how small, can garner big headlines if it’s at a big company. Enough press coverage can spark weeks of outrage and land top leaders in front of Congress. However, breaches that actually cause damage are relatively rare. As a result, vendors often try to make a big deal out of minor breaches that don’t expose important company or customer information. For instance, all four executives said vendors tried to draw their attention to potentially exposed data on Amazon and Microsoft Azure cloud servers. None of this data included any current material information. In one case, a database housed business plans for a 10-year-old project that had already been reported on and was now irrelevant. In another case, the data included information about customers — but only their names and the fact that they had attended a technology conference several years earlier.



When Scrum Is Not The Right Answer

As organizations have bought into adopting an Agile approach to software development, I've noticed that one corporation's identification with terms like Agile or Scrum may differ from another's, almost as if they are deciding how they wish to utilize Agile concepts to best meet the needs of their teams. I am really okay with this approach, as I noted in the article, "Agile For the Sake of Being Agile." But, what if Agile or Scrum is not the right answer at all? ... While the flow is certainly more Kanban than anything else, the goal is to keep the flow of work moving forward. Tickets pushed back to the to-do column would not need to go back to the original developer, but could be handled by any other developer, since the code has since been merged. An alternate flow could swap the REVIEW and TEST columns, delaying the merge until after testing has completed, but that was not suggested initially, in order to keep the flow of work moving as quickly as possible. After all, the key is to meet the aggressive deadline.
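The pushback rule described above can be sketched as a tiny state machine. The column names and allowed transitions here are assumptions inferred from the flow as described, not the author's actual board:

```python
# Allowed column transitions for the Kanban-style flow described above.
ALLOWED = {
    "TODO": {"IN PROGRESS"},
    "IN PROGRESS": {"REVIEW"},
    "REVIEW": {"TEST", "TODO"},   # a reviewer may advance or push back
    "TEST": {"DONE", "TODO"},     # a failed test also pushes back to TODO
    "DONE": set(),
}

def move(ticket, column, board):
    """Move a ticket if the transition is allowed; pushed-back tickets
    land in TODO, where any developer may pick them up."""
    current = board[ticket]
    if column not in ALLOWED[current]:
        raise ValueError(f"cannot move {ticket} from {current} to {column}")
    board[ticket] = column
    return board

board = {"T-1": "REVIEW"}
move("T-1", "TODO", board)   # pushed back; not tied to the original developer
print(board["T-1"])          # TODO
```

Swapping REVIEW and TEST, as the alternate flow suggests, would only mean editing the transition table, which is part of what makes the flow easy to adapt.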



Keep in mind, a cloud move is not as simple as downloading new software. It’s an entirely new and different ecosystem, one that involves a list of risks: legal, financial, commercial, and compliance, to name a few. To make such a move without stopping long enough to become informed of the dangers is not a good idea. It’s also not as simple as learning which vulnerabilities and threats are sitting out there at any particular moment in time. Threats evolve over time. Old ones become less effective or fall out of favor with hackers and new ones emerge. ... The problem is that you don’t have direct access to see where your data is stored and verify that deleted data has actually been deleted. To a large extent, you have to take it on faith that your CSP does what it says. Consider the structure of the cloud. There’s a good chance your data is spread over several different devices and in different physical locations for redundancy. Further, the actual deletion process is not the same among providers.


Why We Are Making Things so Complicated

There are many reasons: First, Joseph is dealing with the laws of physics – in a brilliant way I should add. In the virtual world of software-based solutions, such laws don’t apply. Furthermore, I suspect that Joseph had to go to a dozen stores to buy all this apparatus and spend a lot of time finding the right gizmos to fit his process. In software-based solutions, you just click, download it, resize it, or copy and paste it ad infinitum if you wish. It is usually simple, often effortless. It can also go in all directions, augment the overall complexity, but still your IT staff will find a way to make it work. In other words, the drawback of computer-based solutions is that it is easy to “clog your kitchen” as in the video. Second, after Joseph is done with video-making, he cleans the kitchen before the in-laws come for dinner. Your IT-based solutions support your business and they stay there as long as you’re operating. As easy as it is to fill the kitchen with software-based components, it is proportionately as difficult to empty the room – unless it was planned for.



Quote for the day:


"Brilliant strategy is the best route to desirable ends with available means." -- Max McKeown


Daily Tech Digest - March 23, 2019

Digital Convergence’s Impact on OT Security

A significant component of the challenge is that IT and OT networks are founded on very different, and often highly contradictory, priorities. IT networks generally follow the well-established Confidentiality/Integrity/Availability (CIA) model. The emphasis is on ensuring the confidentiality of critical data, transactions, and applications, maintaining network and data integrity, and only then ensuring the protected availability of networked resources. These priorities tend to be the basic building blocks of any security strategy. Conversely, OT networks depend upon and operate with an exactly inverted model. The safety and availability of resources is the topmost priority. Assembly lines, furnaces, generators, and other large systems simply should never go offline. Monitoring critical systems, such as pumps, valves, and thermostats, is essential, since any system errors can translate into huge financial loss and pose catastrophic risk to the life and well-being of workers and communities.



Why Isn't Your Current Approach to Scaling Agile Working?


When looking to scale organizational agility, the people in your organization need to own their new way of working. For that to happen, they will have to create their own process that works in their specific context. When people create their process, they will learn what works for them, and a new culture, ‘the way we do things here’, will emerge. Implementing someone else’s model is like providing an answer before knowing the question, which is unlikely to be successful. Instead, consider starting with the simplest process that works; then build upon it using Empirical Process Control and a framework that makes transparent to all what needs improving; that framework is called Scrum. There is a story that in 2001 Toyota wanted to publish a book called “The Toyota Way”. Upon hearing of this, their CEO said he opposed the title, suggesting it should be called “The Toyota Way 2001”, because by the next year their way of working would have changed.


Six Recommendations for Aspiring Data Scientists


One of the skills that I like to see data scientists demonstrate is the ability to make different components or systems work together in order to accomplish a task. In a data science role, there may not be a clear path to productizing a model, and you may need to build something unique in order to get a system up and running. Ideally a data science team will have engineering support for getting systems up and running, but prototyping is a great skill for a data scientist who needs to move quickly. My recommendation here is to try to get different systems or components to integrate within a data science workflow. This can involve getting hands-on with tools such as Airflow in order to prototype a data pipeline. It can involve creating a bridge between different systems, such as the JNI-BWAPI project I started to interface the StarCraft Brood War API library with Java. Or it can involve gluing different components together within a platform, such as using GCP DataFlow to pull data from BigQuery, apply a predictive model, and store the results to Cloud Datastore.
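As an illustration of this kind of glue work, here is a toy, pure-Python stand-in for the pull-score-store pattern the author mentions; the `extract`, `score`, and `load` functions and the churn example are hypothetical placeholders, not part of any real BigQuery or Datastore pipeline:

```python
# Toy version of the "pull data, apply a model, store results" pattern.
def extract(rows):
    # In a real pipeline this would be a BigQuery read.
    yield from rows

def score(row, threshold=0.5):
    # Placeholder "model": flag users whose activity falls below a threshold.
    return {**row, "churn_risk": row["activity"] < threshold}

def load(scored, sink):
    # In a real pipeline this would write entities to Cloud Datastore.
    sink.extend(scored)

rows = [{"user": "a", "activity": 0.9}, {"user": "b", "activity": 0.2}]
results = []
load((score(r) for r in extract(rows)), results)
print(results[1]["churn_risk"])  # True
```

The value of prototyping this way is that each stage can later be swapped for the production component (a warehouse read, a trained model, a datastore write) without changing the overall shape of the workflow.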


Three Questions to Gauge Emotional Intelligence

For work teams to succeed, your employees need to trust one another. It’s been found that high-trust environments promote higher worker engagement, with the research finding that on the opposite end, when trust is compromised, people “become withdrawn and disengaged.” ... Building trust requires multiple emotional intelligence competencies. It means understanding what the other person is expressing, sensing what they’re feeling, being conscious of your own behavior, and altering your behaviors with each individual. I’ve found this interview question is a great opportunity to probe how much thought a candidate gives to all these elements. ... Increasingly, employees and customers are flocking to companies that have a social purpose — a desire to do something good for the world — in addition to their profit motives. EY reports that these companies have been shown to far outperform the S&P average. If your company has a purpose, a candidate who has prepared for the interview will likely know it. But asking them to recite a line they read somewhere on your corporate website won’t tell you much.


Improve help desk management for smooth IT operations


A regular time sink in IT management is duplicate work in the help desk from a lack of communication among systems administrators, developers or other support staff. Recurrent problems are fixed superficially and are liable to arise again in a future ticket. Each fix increases the burden of platform maintenance, as help desk agents apply change after change. While specific log restraints streamline issue management, industry analyst Clive Longbottom presented another option for help desk management improvement: Adopt a natural language processing and knowledge management system. NLP augments help desk management with a system that analyzes the language in tickets, compares it to previous entries and helps identify patterns. Knowledge management also helps discover relationships between current and past issues and alerts IT staff to those connections to provide greater context for resolution. Legacy IT service management systems are reactive and require a person or machine to open the ticket before it can be resolved. Through the implementation of AI, IT teams turn the help desk into a proactive system -- and reduce their workloads.
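A bare-bones sketch of the ticket-matching idea: cosine similarity over word counts stands in for the NLP system described, and the tickets below are invented examples:

```python
import math
from collections import Counter

def similarity(a, b):
    """Cosine similarity between two tickets' word-count vectors --
    a minimal stand-in for real NLP-based ticket matching."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    norm = math.sqrt(sum(c * c for c in va.values())) * \
           math.sqrt(sum(c * c for c in vb.values()))
    return dot / norm if norm else 0.0

past = ["printer offline after driver update",
        "vpn drops every hour on laptop"]
new = "printer shows offline after the latest driver update"
best = max(past, key=lambda t: similarity(new, t))
print(best)  # "printer offline after driver update"
```

Surfacing the closest past ticket alongside a new one is exactly the "greater context for resolution" the passage describes: the agent sees how the recurring problem was fixed before, instead of rediscovering it.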


Defining a Distinguished Engineer

A technical leader should build up others and empower their colleagues to do things that are more challenging than what they might think they are capable of. This is key for growing other members of an organization. I personally believe you don’t need a high title to take on a hard task; you just need the support and faith that you are capable of handling it. That support should come from the distinguished engineer and be reflected in their behavior towards others. A technical leader should also make time for growing and mentoring others. They should be approachable, and they should communicate with their peers and colleagues in a way that reinforces that approachability. They should welcome newcomers to the team and treat them as peers from day one. A distinguished engineer should never tear others down, but they should be capable of giving constructive criticism on technical work. This does not mean finding something wrong just to prove their brilliance; no, that would make them the brilliant jerk.


Why AI will make healthcare personal

A control monitor during a heart catheterization operation.
AI is already contributing to reducing deaths due to medical errors. After heart disease and cancer, medical errors are the third-leading cause of death. Take prescription drug errors. In the US, around 7,000 people die each year from being given the wrong drug, or the wrong dosage of the correct drug. To help solve the problem, Bainbridge Health has designed a system that uses AI to take the possibility of human error out of the process, ensuring that hospital patients get the right drug at the right dosage. The system tracks the entire process, step-by-step, from the prescription being written to the correct dosage being given to the patient. Health insurance company Humana is using AI to augment its human customer service. The system can send customer service agents real-time messages about how to improve their interaction with callers. It’s also able to identify those conversations that seem likely to escalate and alert a supervisor so that they’re ready to take the call, if necessary. This means the caller isn’t put on hold, improving the customer experience and helping to resolve issues faster.


Agile in Higher Education: Experiences from The Open University

Thinking about the enterprise agility theme, as described in great recent books by Sriram Narayan (Agile IT Organization Design) and Sunil Mundra (Enterprise Agility), I am afraid to say that universities in the UK are going in the opposite direction, by consolidating their academic schools and departments into bigger and bigger mega faculties, and everyone else into 'professional-services' mega units, so you see lots of large, functional, activity-oriented teams in silos with huge costs of communication and collaboration, slow decision making, and low levels of customer focus and staff empowerment. But universities are starting to wake up to the potential of agile, and some are using agility to transform their strategy and delivery at the organisational level. National University of Singapore is a great example of this for the UK higher education sector. The Open University is the largest university in the UK, with 200,000 students. Each year we produce nearly 200 new online courses, and update 300 more.


AI: A new route for cyber-attacks or a way to prevent them?

If deployed correctly, AI can collect intelligence about new threats, attempted attacks and successful breaches – and learn from it all, says Dan Panesar, VP EMEA, Certes Networks. “AI technology has the ability to pick up abnormalities within an organisation’s network and flag them more quickly than a member of the cyber security or IT team could,” he says. Indeed, current iterations of machine learning have proven to be more effective at finding correlations in large data sets than human analysts, says Sam Curry, chief security officer at Cybereason. “This gives companies an improved ability to block malicious behaviour and reduce the dwell time of active intrusions.” It is true that AI increases efficiency, but the technology isn’t intended to completely replace human security analysts. “It’s not to say we are replacing people – we are augmenting them,” says Neill Hart, head of productivity and programs at CSI. However, AI and machine learning also have a dark side: the technology is also being harnessed by criminals. It would be short-sighted to think that the technological advancements offered by AI will provide a complete barrier against the fallout of cyber-attacks, says Helen Davenport, director, Gowling WLG.
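As a minimal illustration of flagging network abnormalities, here is the simplest possible statistical detector: a z-score threshold over a traffic metric. Real products use far more sophisticated models, and the login counts below are invented:

```python
import statistics

def flag_anomalies(samples, threshold=2.5):
    """Flag values more than `threshold` standard deviations from the mean --
    the most basic form of the abnormality detection described above."""
    mean = statistics.mean(samples)
    stdev = statistics.pstdev(samples) or 1.0  # avoid dividing by zero
    return [x for x in samples if abs(x - mean) / stdev > threshold]

# Hourly login counts; the spike stands out against normal traffic.
logins = [12, 15, 11, 14, 13, 12, 16, 14, 13, 300]
print(flag_anomalies(logins))  # [300]
```

The point of the quoted correlation claim is that a machine can apply this kind of check continuously across millions of events, surfacing only the outliers for a human analyst to triage.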


How Do You Know When A Cybersecurity Data Breach Is Over?

The answer is often a surprise. It isn’t over when you’ve removed a hacker or insider threat from your network environment, just as it doesn’t begin with the discovery of patient zero of a cyber attack. It ends when your organizational attitudes toward cybersecurity revert to what they were before the breach. The question is: "Is the return to 'business as usual' a good thing?" Usually not, especially when you think about how the breach began. Most organizations I've worked with assume a data breach begins when a hacker penetrates your network. But it actually starts long before — with the sum of bad security habits, mismanaged mergers and acquisitions, budget decisions that scrimp on security and bad choices like relying on outdated equipment or not deploying security patches. In this way, a breach can be a good thing because it wakes everyone up — it serves as the greatest security awareness exercise possible. When a breach occurs, everyone is interested in information security for a brief duration — from the incident response and mitigation teams to public relations.



Quote for the day:


"Leadership is a journey, not a destination. It is a marathon, not a sprint. It is a process, not an outcome." - John Donahoe


Daily Tech Digest - March 22, 2019

Artificial Intelligence Can Help Or Hurt Any Business


Everyone has heard about the big potential for using artificial intelligence (AI) to expand your business, but many of the small businesses I mentor are still wary of embracing it, because they don’t understand how it works, and fear losing control and unintended consequences. My advice is that AI is here, so it behooves all of us to learn how to use it properly and move forward. For example, it is a no-brainer to first take advantage of the wave of new capabilities for data collection and smarter analysis to improve productivity and marketing. What is not so obvious is how to create and roll out solutions that can directly impact customer trust or financial well-being. There have been too many recent glitches, such as evidence of devices invading our privacy.  To put this all in perspective, I was happy to see the guidance and recommendations on how to deal with artificial intelligence correctly in a new book, “The Big Nine,” by Amy Webb. As a recognized futurist and thought leader in this space, she outlines how the big nine tech titans, including Google, Microsoft, and Alibaba, should be working to solve key long-term issues.



Researchers at Microsoft and the late Microsoft co-founder Paul Allen's school of computer science at the University of Washington have built a system of liquids, tubes, syringes, and electronics around a benchtop to deliver the world's first automated DNA storage device. Using the proof-of-concept DNA storage device, the researchers demonstrated its write and read capabilities by encoding the word 'hello' in snippets of DNA and converting it back to data. The benchtop unit cost around $10,000, but the researchers believe it could be built in low volumes for a third of the cost by cutting out sensors and actuators. The unit, described in Nature, consists of computers with encoding and decoding software that translate digital ones and zeros into DNA's four bases: A, C, T, G. There's also a DNA synthesis module and a DNA preparation and sequencing module, between which sits a vessel where DNA is stored. Microsoft principal researcher Karin Strauss says the group wanted to prove there is a practical way of automating DNA data storage.
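The encoding step can be illustrated with a simple two-bits-per-base scheme. The article does not disclose Microsoft's actual encoding (which also has to handle biochemical constraints and error correction), so the pairing below is an assumption purely for illustration:

```python
# One possible digital-to-DNA mapping: two bits per base.
TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
FROM_BASE = {v: k for k, v in TO_BASE.items()}

def encode(text):
    """Turn ASCII text into a strand of A/C/G/T, two bits per base."""
    bits = "".join(f"{byte:08b}" for byte in text.encode("ascii"))
    return "".join(TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand):
    """Reverse the mapping: bases back to bits, bits back to ASCII."""
    bits = "".join(FROM_BASE[b] for b in strand)
    return bytes(int(bits[i:i + 8], 2)
                 for i in range(0, len(bits), 8)).decode("ascii")

strand = encode("hello")
print(strand)          # CGGACGCCCGTACGTACGTT
print(decode(strand))  # hello
```

Since each base carries two bits, five ASCII characters (40 bits) become a 20-base strand; the hard part the researchers automated is not this translation but the wet chemistry of synthesizing and sequencing the molecules.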


Business leaders disillusioned with business transformation 


“Begin at the beginning, and go on till you come to the end: then stop.” That is not the same thing, however, as saying that a digital business transformation process should begin without a clear idea of where it is going. Indeed, this is vital. Unfortunately, the Celonis study also found that most organisations are struggling with transformation initiatives because they are diving into execution before understanding what actually needs changing. The research found that 39% of analysts are not basing their work on internal processes when executing the transformation strategy given to them by senior personnel. Celonis suggested that this highlights “that business leaders are investing in transformation initiatives because they think they should and not because they have identified a specific problem.” Businesses are also skipping square one, suggests the report, and are “still jumping straight into tactics,” giving AI, machine learning and automation as examples. The survey found that 73% of the C-suite say these are areas in which they want to maintain or increase investment. In contrast, a fraction under a third of senior leaders state that they plan to invest more in getting better visibility of their processes.


How tech brings learning and development to deskless employees


"Interestingly, there have been some really significant advancements in brain science, cognitive science," Leaman said. This research looks at how the brain remembers information. "What we know now is that people do what they remember. If they don't remember they will guess. So how do you get people to remember and not guess or just simply not do?" This perspective has shifted learning to the form of micro content, focusing on key learning points that are accessible via a mobile device, just when an employee needs it. Typical use cases include restaurant employees accessing recipe cards, manuals or operational reference material to learn about new promotions, Carr said. Or medical device sales representatives can access and learn about new product information, product launches and new drugs. A fitness club employee can look up the day's workout each morning before teaching it to the class, and a field service tech can look up quick tips on a potential problem before going into a customer's home.


Where Technology Fits in the Employee Experience

Digital transformation, Mike Graham, CEO of Epilogue Systems argues, involves a lot of people over a long period of time. Preparing for staff, budget and time exhaustion — before it happens — is critical to your team and digital transformation. External staff, such as systems integrators and software companies will vanish after the go-live date and internal staff may take themselves out of the project, or completely leave the organization. “Users must be able to effectively adopt the technology themselves in order for digital transformation to reach its full potential,” he said. Think about digital adoption beyond the critical first months. While adoption in the first three to five months after things go-live is critical, it’s a process that’s never complete. Think of all the changes that an application experiences over time: upgrades, shifts in an organization’s application landscape, integrations and APIs, and an increasingly complex digital workplace. Beyond that, there’s also the challenges of the workforce to account for: hiring, turnover, retirement, role changes and business model evolution.


Atlassian, AgileCraft join to scale Agile development


AgileCraft's value stream management technology provides joint visibility for Atlassian's own stack into Azure DevOps Server, Rally and various continuous delivery tools. "What makes AgileCraft interesting is the platform's focus on helping enterprises figure out how to replicate DevOps success by holistically looking at and correlating the business and financial side of things and DevOps process flows," said Torsten Volk, an analyst at Enterprise Management Associates, based in Boulder, Colo. Atlassian's addition of a DevOps analytics platform could replicate the success of one or two high-performing DevOps teams across the entire enterprise, Volk said. "Considering the lack of competition in this arena, I think the $166 million could prove to be money well spent," he said. A key question is whether AgileCraft customers will face pressure to move to the Atlassian stack. Most enterprises use a variety of different products at the teamwork level.


Quantum computing will break your encryption in a few years

Google's 72-qubit Bristlecone quantum computing chip.
Quantum computing-based security technology is effective because it relies on two of the best-known properties of quantum physics – the idea that observing a particle changes its behavior, and that paired or “entangled” particles share the same set of properties as the other. What this means, in essence, is that both parties to a message can share an identical cipher key, thanks to quantum entanglement. In addition, should a third party attempt to eavesdrop on that sharing, it would break the symmetry of the entangled pairs, and it would be instantly apparent that something fishy was going on. “If everything is working perfectly, everything should be in sync. But if something goes wrong, it means you’ll see a discrepancy,” said Jackson. It’s like a soap bubble, according to Brian Lowy, vice president at ID Quantique SA, a Switzerland-based quantum computing vendor – mess with it and it pops. “At some point, you’re going to have to factor [quantum computing],” he said, noting that, even now, bad actors could download encrypted information now, planning to crack its defenses once quantum computing is equal to the task.
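The eavesdropping-detection idea can be simulated. The article describes an entanglement-based scheme; the toy below instead models the closely related prepare-and-measure BB84 protocol, where an intercept-resend eavesdropper disturbs roughly a quarter of the bits kept after basis matching, so a nonzero error rate on sampled bits signals "something fishy":

```python
import random

def bb84(n, eavesdrop, rng):
    """Toy BB84 run: Alice sends bits in random bases, Bob measures in
    random bases, and only matching-basis bits are kept. An intercept-
    resend eavesdropper corrupts ~25% of those kept bits on average."""
    errors = matched = 0
    for _ in range(n):
        bit = rng.randrange(2)              # Alice's raw key bit
        a_basis = rng.randrange(2)          # Alice's preparation basis
        sent_bit, sent_basis = bit, a_basis
        if eavesdrop:
            e_basis = rng.randrange(2)      # Eve measures, then resends
            if e_basis != sent_basis:       # wrong basis: her result is random
                sent_bit = rng.randrange(2)
            sent_basis = e_basis
        b_basis = rng.randrange(2)          # Bob's measurement basis
        measured = sent_bit if b_basis == sent_basis else rng.randrange(2)
        if b_basis == a_basis:              # sifting: keep matching bases only
            matched += 1
            errors += measured != bit
    return errors / matched                 # observed error rate

rng = random.Random(7)
print(bb84(4000, eavesdrop=False, rng=rng))        # 0.0 -- clean channel
print(bb84(4000, eavesdrop=True, rng=rng) > 0.15)  # True -- Eve detected
```

On a clean channel the sifted bits agree exactly, which is the "everything should be in sync" case from the quote; any interception pushes the error rate toward 25%, the discrepancy that pops the soap bubble.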


How to deprecate software features without bothering users

In a situation where the deprecated feature or function will be removed entirely and not replaced, developers should offer suggestions for software layers or tools that can provide a worthwhile alternative, as well as guidance and instruction to help users adopt them. For example, if a custom database is replaced by a third-party SQL database, support staff should ideally help users connect the database to the software and migrate it to the third-party platform. Software deprecation is all about continuity. You must ensure that developers don't alienate the customer base and instead help them through impending product changes to minimize disruptions for their businesses. Providing continued service requires training for the help desk and support team, as well as helpful documentation in the form of notices, guides and knowledge base entries. Customers use your software to help run their businesses, so you must communicate changes about your product -- especially when you deprecate software features -- well in advance.
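In Python, for example, the conventional way to give that advance notice in code is a `DeprecationWarning` that names the removal version and the migration path; the `export_report` function and its `legacy_db` option below are hypothetical:

```python
import warnings

def export_report(path, fmt="csv", legacy_db=None):
    """Hypothetical API whose `legacy_db` option is being retired in
    favor of a third-party SQL database, as described above."""
    if legacy_db is not None:
        warnings.warn(
            "the 'legacy_db' option is deprecated and will be removed in "
            "v3.0; migrate to an external SQL database (see migration guide)",
            DeprecationWarning,
            stacklevel=2,   # point the warning at the caller's code
        )
    return f"exported to {path} as {fmt}"

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    export_report("q1.csv", legacy_db="old.db")
print(caught[0].category.__name__)  # DeprecationWarning
```

Emitting the warning for at least one release cycle before removal gives users the in-product notice that complements the guides and knowledge base entries mentioned above.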


AI cloudops is coming, whether you like it or not

The pros are that you can have a 24/7/365 monitoring and management program on the cheap. If you believe operational staff is expensive, try hiring them for shift work. AI-based monitoring and management systems never sleep, never take time off, and never ask for a raise. Once they are up and running, they cost almost nothing beyond their license fees and infrastructure costs. And they are self-learning at the same time; in other words, the more they run, the better they get at the job. ... One con is that the cost of rolling out these systems is high, even in the cloud. Vendors that have married AI and operational tools are going to charge a premium to get them up and running and in production. While the prices are all over the place, count on paying 50 percent more than for traditional tools, including consulting services for the first year or so to get the tools learning correctly. Another con is that operations people don’t seem to like them no matter how well they perform. The number of passive-aggressive actions that I’ve seen over the years from people pushing back on AI-enabled operations tools has been huge.


How digitalization supplants old insurance models

New technologies are also threatening the long-term viability of credit-based insurance. Carriers are increasingly seeking out data and building predictive models that will prove more powerful and profitable, even during periods of economic volatility. For example, my company, ODN, has shown it is possible to extend more policies at more affordable rates to people with poor credit by pricing risk based on where people drive, rather than who they are. To remain competitive and profitable in the long term, underwriting and actuarial teams need to pay attention to the dynamics of credit-based insurance today and plan for a future that simply may not include FICO. Carriers should be asking: What will happen to our pricing models if economic conditions or regulators make credit-based insurance irrelevant? What new technologies need to be in place to continue pricing risk and remain competitive in a world without FICO?



Quote for the day:


"Leaders dig into their business to learn painful realities rather than peaceful illusion." -- Orrin Woodward


Daily Tech Digest - March 21, 2019

Industry 4.0 shifting from buzzword to reality, says Hampleton Partners' M&A report

Hampleton’s Industry 4.0 M&A Market Report records more than 600 deals in 2018, up from 513 in 2017. The analysis reveals that the highest level of interest lies in AI technologies with context information, digital threads and digital twin solutions. Dr.-Ing. Peter Baumgartner, sector principal at Hampleton Partners, said: “A mere buzzword a few years ago, Industry 4.0 has become today’s reality and is one of the hottest M&A sectors in the DACH region. Liquidity is at a high level, meaning that buyers have the funds to support start-ups or established Industry 4.0 players, and the cutting-edge technology coming out of the region has generated many M&A deals.” Industry 4.0 has become integral to the region’s technology giants such as Bosch Rexroth, Festo and Siemens, whilst a recent strategic partnership between Rockwell Automation and PTC, accompanied by a $1 billion equity investment from the former, further demonstrates the importance of integrating innovations such as IoT and augmented reality with more traditional industrial automation.


CISOs, Know Your Enemy: An Industry-Wise Look At Major Bot Threats


According to a study by the Ponemon Institute in December 2018, bots comprised over 52% of all Internet traffic. While ‘good’ bots discreetly index websites, fetch information and content, and perform useful tasks for consumers and businesses, ‘bad’ bots have become a primary and growing concern to CISOs, webmasters, and security professionals today. They carry out a range of malicious activities, such as account takeover, content scraping, carding, form spam, and much more. The negative impacts resulting from these activities include loss of revenue and harm to brand reputation, theft of content and personal information, lowered search engine rankings, and distorted web analytics, to mention a few. For these reasons, researchers at Forrester recommend that, “The first step in protecting your company from bad bots is to understand what kinds of bots are attacking your firm.” So let us briefly look at the main bad bot threats CISOs have to face, and then delve into their industry-wise prevalence.


A Comparison Between Flutter And React Native

As the need for mobile apps increases, developers are looking for ways to build better apps faster. New frameworks are emerging to make app developers' work easier. With cross-platform app development, developers can create attractive, native-like apps that provide a better user experience while keeping the development process easy and fast. As more and more frameworks emerge, there is a natural urge to compare them and find out which is more suitable. Flutter is a reasonably new framework, while React Native has been around for quite some time now. Both are cross-platform frameworks that help developers build native apps easily, so a comparison of the two will help many app developers decide which is better for their apps. Flutter is a product from Google, while React Native was launched by Facebook. Cross-platform frameworks are a great help for developers because they avoid the need to maintain two teams for the two mobile platforms.


Resumable Online Index Create and Rebuild Operations

When you cancel an index rebuild or a create index operation in SQL Server prior to SQL Server 2017, the database engine must roll back all the work it has done on the index. Because of this, when you restart the index rebuild or create index process, SQL Server has to start rebuilding or creating the index all over again. This wastes processing time and resources just to redo what was done prior to cancelling the operation. But if you migrate your older databases to SQL Server 2017, you can pause and resume your online index rebuild operations. Plus, with the rollout of the previews of Azure SQL Database and SQL Server 2019, you can pause and restart both your online rebuild and create index operations. Being able to pause these online index operations allows SQL Server to pick up the rebuild or create index operations where they left off.
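The T-SQL involved is short. As a sketch, with illustrative index and table names, here are helpers that build the statements a session would issue (wrapped in Python strings so they can be inspected; they would be executed over an ordinary database connection):

```python
# Illustrative T-SQL builders for resumable online index operations
# (SQL Server 2017+ for rebuilds). Index/table names are made up.

def resumable_rebuild_sql(index, table):
    # MAX_DURATION auto-pauses the rebuild after the given time.
    return (f"ALTER INDEX {index} ON {table} "
            "REBUILD WITH (ONLINE = ON, RESUMABLE = ON, "
            "MAX_DURATION = 60 MINUTES);")

def pause_sql(index, table):
    return f"ALTER INDEX {index} ON {table} PAUSE;"

def resume_sql(index, table):
    # Picks the rebuild up where it left off instead of starting over.
    return f"ALTER INDEX {index} ON {table} RESUME;"
```

A paused operation's progress can be checked in the `sys.index_resumable_operations` view before deciding to resume or abort.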


A typical cell phone has nearly 14 sensors, including an accelerometer, GPS, and even a radiation detector. Industrial Things such as wind turbines, gene sequencers, and high-speed inserters can easily have 100 sensors. People enter data at a snail’s pace when compared with the barrage of data coming from the IoT. A utility grid power sensor, for instance, can send data 60 times per second, a construction forklift once per minute, and a high-speed inserter once every two seconds. Technologists and businesspeople both need to learn how to collect and put all of the data coming from the industrial IoT to use and manage every connected Thing. They will have to learn how to build enterprise software for Things versus people. The industrial IoT is all about value creation: increased profitability, revenue, efficiency, and reliability. It starts with the target of safe, stable operations and meeting environmental regulations, translating to greater financial results and profitability.
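Rough arithmetic on the rates quoted above shows the scale of the barrage per device, per day:

```python
# Daily message volumes at the article's quoted rates (rough arithmetic).
DAY = 24 * 60 * 60  # seconds in a day

grid_sensor_msgs = 60 * DAY   # utility grid sensor: 60 readings per second
forklift_msgs = DAY // 60     # construction forklift: one per minute
inserter_msgs = DAY // 2      # high-speed inserter: one every two seconds
```

A single grid sensor alone produces over five million readings a day, which is the kind of volume enterprise software for Things has to absorb that software for people never did.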


jOOPL: Object-Oriented Programming for JavaScript

Web development has increased in complexity over the last decade. Think about what the Web was and what it has turned into: the Web of applications, also known as Web 2.0 and the coming 3.0. JavaScript has been the language that accompanied the Web since its early stages. It was once just a way to add some fancy effects to our pages, but as the language has evolved into an actual application programming language, the need to reuse and scale has become an important point in Web development. Certainly, the object-oriented approach to graphical user interfaces, domain logic and other layers has demonstrated that it is a good way of creating well-structured, reusable and maintainable software. The catch is that JavaScript is not an object-oriented programming language: it is prototype-oriented, which offers only a weak approximation of a full-fledged object-oriented platform. That is why jOOPL exists. "jOOPL" stands for "JavaScript Object-Oriented Programming Library".


Cash review suggests fintech “is not a panacea” yet


“Fintech is fantastic as it is, but it is not a panacea,” said banking specialist Mark Aldred of Auriga, the ATM software and cash management firm, by email. Access to Cash, an independent body established to gauge the effects of going cashless, reported in its final review in early March that 2.2m people rely solely on cash while 8m would struggle in a completely cash-free society. “There are technological developments which could address many of the needs of those who depend on cash,” reads the executive summary, citing the UK’s reputation as a source of financial technology innovation. The word ‘fintech’ appears 19 times, each time exploring how the fledgling sector could better serve the 2.2m. However, the report also acknowledges that fintechs tend to target early adopting consumers as opposed to the majority of late adopters who populate the 8m underserved. “Fintech is seeking to move from its digitally-savvy demographic,” said Aldred. “Key to mainstream adoption of app-only banks and other fintech options will be how trust is developed in availability of these services.”


Cyberattacks: Europe gets ready to face crippling online assaults


The agency said that, to be certain an attack is criminal, the electronic evidence found within the affected IT systems must be preserved, as this is essential for any criminal investigation. "It is of critical importance that we increase cyber preparedness in order to protect the EU and its citizens from large scale cyberattacks," said Wil van Gemert, deputy executive director of operations at Europol. While European governments and businesses face a range of threats, it is notable that the announcement comes ahead of European elections in May and a number of other votes across Europe this year. As well as large-scale ransomware attacks, Europe is keen to stop any repeat of the election meddling that affected the US presidential election in 2016. In February, Microsoft warned that it had seen recent hacker activity targeting democratic institutions in Europe, including attacks on election campaigns as well as think tanks and non-profit organizations that work on topics related to democracy, electoral integrity, and public policy and are often in contact with government officials. Microsoft said that Russian intelligence was behind the attacks.


Cyber security skills shortage driving outsourcing — NETSCOUT research

Operational challenges are further compounded by difficulty in hiring and retaining skilled personnel, which, together with a lack of headcount or resources, were cited as the top challenges faced by security leaders. The findings show that this is driving an increased reliance on outsourced services, with approximately a third of enterprises outsourcing at least a part of their security operation, up 12% from 2017. This trend looks set to continue for the foreseeable future, with 39% of respondents stating they expect to increase their investment in outsourced services in the next 12 months. “In leaning on outsourced security professionals, businesses are identifying the shortfalls of their internal processes and capabilities and are moving to address risk in the only way they can,” added Anstee. “There is nothing wrong with this strategy, as long as businesses are clear that they still own the underlying risk.” Adding to the challenge facing organisations is an evolution in DDoS attack size, with 91% of companies experiencing an attack indicating that their internet connectivity was saturated on at least one occasion.


The Dawn Of The Deep Tech Ecosystem

Deep tech startups rarely follow the established funding progression of other types of young tech enterprises—seeking money from friends and family, then angel or seed investors, then successive rounds of venture capital investment at increasing valuations (which validate the decisions of previous investors), leading ultimately to a trade sale or an IPO. In deep tech, public funding plays an important role in the early phase, and friends-and-family money is rarely significant relative to the substantial capital requirements of early R&D. Private-public financing schemes are becoming increasingly important to financing deep tech ventures along their entire life cycle, and corporate venture capital (CVC) funds, incubators, and accelerators also have become prevalent partners since they provide not only funding but other critical forms of support. ... The growing deep tech ecosystem facilitates research into almost any kind of technology, from things we can’t see to concepts that relatively few can explain. This ecosystem is rooted in a handful of trends.



Quote for the day:


"Inspired leaders move a business beyond problems into opportunities." -- Dr. Abraham Zaleznik


Daily Tech Digest - March 20, 2019

Things happen in the real world that don’t happen in your test environment. Yet what does that mean from a QA perspective? We did everything we were supposed to do in the training phase and our AI model passed, meeting expectations, but it’s not passing in the “inference” phase when the AI model is operationalized. This means we need a QA approach for dealing with AI models in production. Problems that arise with AI models in the inference phase are almost always issues of data. We know the algorithm works. We know that our training data and hyperparameters were configured to the best of our ability. That means that when AI models are failing, we have data problems. Is the input data bad? If the problem is bad data, fix it. Is the AI model not generalizing well? Is there some nuance of the data that needs to be added to the training model? If the answer is the latter, we need to go through a whole new cycle of developing an AI model with new training data and hyperparameter configurations to achieve the right level of fitting to that data.
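A common first check when an operationalized model misbehaves is whether the live inputs have drifted away from the training data. A minimal sketch of that idea; the statistic and threshold here are illustrative, not a standard:

```python
from statistics import mean, stdev

def drift_score(train, live):
    # Standardized shift of the live mean relative to the training spread:
    # a crude proxy for "is the input data bad or just different?"
    return abs(mean(live) - mean(train)) / stdev(train)

def needs_retraining(train, live, threshold=3.0):
    # Flag inputs more than `threshold` training standard deviations away.
    return drift_score(train, live) > threshold
```

In practice this kind of check runs per feature; a triggered flag distinguishes "fix the bad input data" from "start a new training cycle," the two branches described above.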


TLS 1.3: A Good News/Bad News Scenario

While TLS 1.3 enables much better end-to-end privacy, it can break existing security controls in enterprise networks that rely on the ability to decrypt traffic in order to perform deep-packet inspection to look for malware and evidence of malicious activity. Well-known examples of those security controls include next-generation firewalls, intrusion prevention systems, sandboxes, network forensics, and network-based security analytics products. These security controls rely on access to a static, private key in order to decrypt traffic for inspection. The use of such keys is replaced in TLS 1.3 by the required ephemeral Diffie-Hellman key exchange, which provides perfect forward secrecy. That exchange occurs for each session or conversation that is established between endpoints and servers. In addition, the certificate itself is encrypted, which denies those tools access to valuable metadata for additional analysis. The ephemeral key exchange is not new to TLS; TLS 1.2 also included it as an option.
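On the client side, insisting on this behavior is a one-liner in most TLS stacks. For example, in Python's standard `ssl` module (3.7+):

```python
import ssl

# Build a client context that refuses anything older than TLS 1.3,
# so every session uses an ephemeral, forward-secret key exchange.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_3

# With no static RSA key exchange available, a stored private key can
# no longer passively decrypt captured traffic - which is exactly what
# breaks key-escrow-based inspection tools.
```

A connection made with this context (via `context.wrap_socket(...)`) will fail the handshake against any server that cannot negotiate TLS 1.3.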


Who's Responsible When IT Goes Awry?

"IT professionals tend to be pleasers. They say 'yes' to a lot of things when they should say 'no,'" said Dave Gartenberg, chief HR officer at professional services firm Avanade. "Sometimes they'll agree to do something with less budget or less line leader involvement in order to be helpful. You'll see a lot of projects moving forward with the best of intentions when in fact anyone on the outside looking in can see it would never stand a chance. I hold the IT leaders accountable for making sure from the start the conditions for success were contracted internally." Peter Kraatz, portfolio manager of Cloud and Data Center Transformation Consulting at IT solutions services provider Insight Enterprises, said the lack of governance also contributes to IT issues. “IT has to own the mechanical bits of governance: Who's got what role, who's going to pull what triggers and when. Why we’re doing them is something that's owned by the business," said Kraatz. "The business has to tell us when we’re running out of budget on Amazon or we’ve got the wrong workloads. I think we’re allergic to talking to one another.”


Raspberry Pi-style Jetson Nano is a powerful low-cost AI computer from Nvidia

Nvidia released a series of benchmarks showing the Jetson Nano outperforming competitors when running various computer vision models. The results show the Jetson Nano beating the $35 Raspberry Pi 3 (no mention of the model), the Pi 3 with a $90 Intel Neural Compute Stick 2, and the newly released Google Coral board that uses the Edge TPU (Tensor Processing Unit). These tests involved running a range of computer vision models carrying out object detection, classification, pose estimation, segmentation and image processing. Specifically, the Jetson showed superior performance when running inference on trained ResNet-18, ResNet-50, Inception V4, Tiny YOLO V3, OpenPose, VGG-19, Super Resolution, and Unet models. The Jetson Nano was the only board able to run many of the machine-learning models, and where the other boards could run the models, the Jetson Nano generally offered many times the performance of its rivals. Nvidia's senior manager of product for autonomous machines, Jesse Clayton, told TechRepublic's sister site ZDNet that the Jetson Nano's GPU could run a broader range of machine-learning models than the specialist silicon found in Google's Edge TPU.


Terrified Of The Internet, Putin Signs Laws Making It Illegal To Criticize 

Russia's efforts to clamp down on anything resembling free speech on the internet continue unabated. Putin's government has spent the last few years effectively making VPNs and private messenger apps illegal. While the government publicly insists the moves are necessary to protect national security, the actual motivators are the same old boring ones we've seen here in the States and elsewhere around the world for decades: fear and control. Russia doesn't want people privately organizing, discussing, or challenging the government's increasingly authoritarian global impulses. After taking aim at VPNs, Putin signed two new bills this week that dramatically hamper speech, especially online. One law specifically takes aim at the nebulous concept of "fake news," specifically punishing any online material that "exhibits blatant disrespect for the society, government, official government symbols, constitution or governmental bodies of Russia." In other words, Russia wants to ban criticism of Putin and his corrupt government.


Stanford Aims to Make Artificial Intelligence More Human


First, ensuring as best we can that the advancement of artificial intelligence ends up serving the interests of human beings, and not displacing or undermining human interests. The essential thing is to ensure that as machines become more and more intelligent and are capable of carrying out more and more complicated tasks that otherwise would have to be done by human beings, that the role we give to machine intelligence supports the goals of human beings and the values we have in the communities we live in, rather than step-by-step displacing what humans do. Second, the bet that the institute is making here at Stanford is that the advancement of artificial intelligence will happen in a better way if, instead of just putting technologists and AI scientists in the lab and having them work really hard, we do it in partnership with humanists and social scientists. So the familiar role of the social scientist or philosopher is that the technologists do their thing and then we study it out in the wild; the economist measures the effects of technology and the disruption it has, or the philosopher tries to worry about the values that are disrupted in some way by technology.


How AI Can Transform Customer Experience By Listening Better to the Voice of Customers

While financial dealings, business transactions, and operational updates can be quantified and computed upon, the same cannot be said of human interactions. With natural language being the free-flowing mode of communication amongst people, the spoken and written words contain a treasure trove of information. And today, this remains largely under-leveraged. Whether it is periodic customer surveys, chatter on social media, feedback on review websites, interactions through contact centers, or ongoing communications with customer service professionals, all these touch-points are peppered with vital clues that can help answer the million-dollar question, “What do customers really want?” However, many enterprises use archaic approaches to customer survey and digital listening programs. Textual feedback from these programs is often subjected to superficial text analytics that don’t go beyond simple text summaries, frequency counts of words, or naive sentiment analysis. These squander valuable customer signals, falling short on intelligence and actionability.
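A toy example of why frequency counts alone squander signal (the feedback text and word lists are illustrative):

```python
from collections import Counter

feedback = "The app is not good. Support was not helpful. Not happy."
tokens = feedback.lower().replace(".", "").split()

# A naive frequency count still registers three "positive" words...
POSITIVE = {"good", "helpful", "happy"}
positive_hits = sum(Counter(tokens)[w] for w in POSITIVE)

# ...while even one step deeper, reading each "not X" pair as a unit,
# shows that every one of those positives is actually negated.
negated = sum(1 for a, b in zip(tokens, tokens[1:])
              if a == "not" and b in POSITIVE)
```

A summary built on raw counts would score this feedback as positive; modern NLP models go much further than this bigram trick, but even it flips the verdict, which is the gap between "text summaries" and actionable intelligence.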


The Future of A.I. Isn’t Quite Human


At first glance, an A.I. brought to life on the red carpet may feel jarring. But A.I. is already operating in many aspects of our lives: It controls your Facebook news feed, it helps make your salad, and it opposes you in video games. And while a fleet of Protoss carriers gliding across a choke point in Starcraft II may appear less “real” than Shudu in her gown on a real red carpet — or the virtual avatars created by Facebook and spotlighted in a Wired feature last week — cutting-edge work happening behind the scenes in these virtual worlds may actually say quite a bit more about an emerging universe of the almost-human, where the line between person and machine blurs. After all, Google wouldn’t spend upwards of $500 million on nothing. The company’s DeepMind property uses advanced algorithmic learning to mimic and surpass human play style in games, but that’s nothing compared to what’s coming. “This is not going away,” Morgan Young, the CEO and co-founder of Quantum Capture who worked on Shudu’s BAFTAs project, tells OneZero. “This is just the beginning of how powerful characters can be when they’re combined with A.I.”


Cyber Crime Competes Against the Good Guys for Talent

One factor that has benefited cyber crime is the professionalization of the threat space. Previously more disparate, the underground now functions very much like legitimate businesses operating under a “supply and demand” philosophy. Product/service competition and as-a-service offerings fuel the growth of the maturing marketplace. This forces developers and sellers to provide quality merchandise at competitive prices. An aggressive marketing strategy helps gain market share, with favorable reviews from customers and forum administrators providing corroboration of product utility and the bona fides of sellers. It is common for sellers to offer 24×7 help support, as well as customizable features, to prospective customers. Moreover, the goods and services provided in the underground are not exclusively tailored for experienced cyber crime actors. Some products target inexperienced customers, thereby lowering the bar to entry into cyber criminal operations. This allows anyone who can pay the price point to engage in hostile activities, either on their own via user-friendly graphical user interfaces, or by simply paying for the service and hiring “professionals” to do the job.


Mirai Botnet Code Gets Exploit Refresh

In the latest version of Mirai, meanwhile, Palo Alto's Nigam says researchers found two unexpected exploits: one for the WePresent WiPG-1000 Wireless Presentation system and another for a content management system developed by LG to manage screen-based signage. Neither of the exploits had been seen in the wild before. Both types of software are most likely to be used by businesses. "In particular, targeting enterprise links also grants it access to larger bandwidth, ultimately resulting in greater firepower for the botnet for DDoS attacks," Nigam writes. The exploit for LG targets a vulnerability (CVE-2018-17173) in its LG SuperSign EZ CMS 2.5, which ships as part of LG's WebOS operating system in smart TVs. The vulnerability was disclosed in September 2018. The exploit in WePresent attacks a command injection vulnerability. The vulnerability was contained within several versions of software in WePresent WiPG-1000 devices, which are wireless routers designed for screen sharing. Barco, the device's developer, has patched the vulnerability.


Organizations need to make mobile security a priority in 2019


The challenge is that WiFi relies on mostly insecure protocols and standards, making networks easy to impersonate and traffic easy to intercept, mislead and redirect. This is true independently of how new or updated your device is; it relates only to how the underlying WiFi infrastructure works. There are times when you don’t even need to perform any action for an attack to be perpetrated against you. Do you remember that WiFi network you connected to while having lunch the other day? In order to make your life easier, your device will connect to it automatically if it recognizes the network. Even when it’s not the same network, it just has to claim to be it. From over in the corner, the hacker effortlessly hijacks your session, captures your credentials, delivers a targeted exploit, and assumes full control of every function on your smartphone - including those that log in to your company’s WiFi and send emails in your name. This year, Zimperium attended Mobile World Congress (MWC) in Barcelona and RSA in San Francisco; the attendance for the two shows combined was more than 150,000 executives, salespeople, media, etc.



Quote for the day:


The essential question is not, "How busy are you?" but "What are you busy at?" -- Oprah Winfrey


Daily Tech Digest - March 19, 2019

Time-series monitoring helps ops teams predict long-term trends based on patterns found in historical data. This type of monitoring digs into past metrics to forecast what's likely to occur next in a system. Organizations can use time-series monitoring to predict trends around autoscaling, required capacity and more. The time-series method can also support more accurate troubleshooting due to the expansive range of data collected over time. ... APM tools track the performance of enterprise software. They monitor an application over time, offering data around memory demands, code execution, network bandwidth and disk read/write speeds. Admins can use this data to evaluate how an app's dependencies might affect its performance and to pinpoint the cause of any performance issues ... This category of monitoring measures IT performance, based specifically on the perspective of end users. It might track, for example, the response time of a virtual desktop or other user-facing applications. 
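The trend-prediction part of time-series monitoring can be as simple as a least-squares extrapolation over historical samples. A minimal sketch, with made-up disk-usage numbers:

```python
def linear_forecast(history, steps_ahead):
    """Fit a least-squares line through (0..n-1, history) and extrapolate:
    a bare-bones version of the trend prediction monitoring tools do."""
    n = len(history)
    mx, my = (n - 1) / 2, sum(history) / n
    slope = (sum((x - mx) * (y - my) for x, y in enumerate(history))
             / sum((x - mx) ** 2 for x in range(n)))
    intercept = my - slope * mx
    return intercept + slope * (n - 1 + steps_ahead)

# Disk usage in GB sampled daily; forecast where it will be in a week
# to decide whether capacity needs to be added.
usage = [40, 42, 44, 46, 48]
```

Real monitoring products layer seasonality and anomaly handling on top, but the core "forecast what's likely to occur next" step is this kind of extrapolation.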


How a small company can make use of data
Clearly, the effective use of AI and data science can be game-changing, but it is not only small businesses that are struggling to deploy data science. The same report reveals that 51% of UK leaders admit their organisation does not currently have an AI strategy in place. Incorporating data science into an operating model is as big a challenge as it is an opportunity, but what are the issues specific to smaller businesses and are they so very different to larger companies? Data can be tricky to make sense of and, though start-ups or small organisations may not be sitting on the same volume of data as their larger counterparts, the variety and velocity will often be comparable. Precisely because small companies are often competing with larger, better-resourced competitors, it’s often absolutely critical that they can quickly make use of their data. The good news is that smaller organisations are in some ways better able to do this than larger, better-established ones. 


With Pulumi, you can create, deploy, and manage any cloud resource using your favorite language. This includes application- and infrastructure-related resources, often in the same program. One area this gets really fun is serverless. Because we're using general purpose languages, we can create resources, and then wire up event handlers, just like normal event-driven programming. This is the way serverless should be! In this article, we'll see how. There's a broad range of options depending on what you want to do, and how your team likes to operate. We'll be using AWS and TypeScript, but other clouds and languages are available. ... Serverless app models today make you think of the event sources -- the S3 buckets -- and event handlers -- the Lambdas and associated code -- as very different things -- "infrastructure" versus "app code" -- managed with distinct tools and workflows. Pulumi, in contrast, gives you a single CLI, pulumi, to manage everything consistently.
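The article's own examples use the real Pulumi SDK against AWS in TypeScript. As a dependency-free sketch of the pattern it describes, a resource and its event handler declared in one ordinary program, consider this toy model (class and method names are illustrative, not the Pulumi API):

```python
# A toy model of the Pulumi idea: the "infrastructure" (a bucket) and
# the "app code" (its event handler) live in the same plain program.
class Bucket:
    def __init__(self, name):
        self.name, self.handlers = name, []

    def on_object_created(self, fn):
        # In real Pulumi this would provision a Lambda function and an
        # S3 event notification; here we just register the callback.
        self.handlers.append(fn)
        return fn

    def put(self, key):
        # Stand-in for an S3 upload firing the object-created event.
        return [handler(key) for handler in self.handlers]

bucket = Bucket("media")

@bucket.on_object_created
def make_thumbnail(key):
    return f"thumbnail for {key}"
```

The point of the pattern is that wiring an event source to its handler looks like ordinary event-driven programming rather than two separate toolchains for infrastructure and code.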


Autism, Cybercrime, and Security's Skill Struggle

Rebecca Ledingham, vice president of cybersecurity at Mastercard, spotted the trend earlier in her career as a cyber agent for the UK's National Crime Agency. "They weren't the kinds of offenders I was used to dealing with in drugs and sex crimes," she said in an interview with Dark Reading. Their social behavior, she said, was different from what she'd seen in other areas of crime. Often, she continued, cybercriminals are first diagnosed as being on the autism spectrum during the criminal justice process. Later in her career, as a cyber agent for INTERPOL's Global Complex for Innovation (IGCI), she realized the issue was broader. Ledingham's work with global agencies revealed that, outside of cybercrime, no other offense came with a foundational condition. "There's no other organic set of offenders that may be predisposed to cybercrime due to the nuances of their disorder," she said. Autism presents itself at the age of two or three, and more than 17 million people worldwide are diagnosed, said Ledingham in an RSA Conference talk.


Middle East tech: Nine things the region must do to safeguard its financial future


The outlook for growth in the Middle East, North Africa region is expected to improve slightly in 2019 and 2020, the World Bank reported last June, noting a range of factors including "a favorable global environment, post-conflict reconstruction efforts, and from oil importers' reforms to boost domestic demand and increase foreign investment". Although welcome, these conclusions do not mask the longer-term economic realities that the region needs to address. The fourth industrial revolution driven primarily by AI and automation is going to radically change how we work, rest and play, bringing about major shifts to societies and economies. States such as Saudi Arabia, Qatar, UAE, and Oman have acknowledged this upheaval, with bold policy documents identifying a new vision for their counties. But the transition to these new digital realities will inevitably be haphazard and uncertain.


EU law enforcement agencies prepare for major cyber attacks


The newly adopted EU Law Enforcement Emergency Response Protocol determines the procedures, roles and responsibilities of key players both within the EU and beyond, including secure communication channels and contact points for the exchange of critical information as well as a coordination and de-confliction mechanism. The protocol is designed to complement the existing EU crisis management mechanisms, said Europol, by streamlining transnational activities and facilitating collaboration with the relevant EU and international players, making full use of Europol’s resources. It further facilitates the collaboration with the network and information security community and relevant private sector partners. Only cyber security events of a malicious and suspected criminal nature fall within the scope of this protocol. It will not cover incidents or crises caused by a natural disaster, man-made error or system failure.


3 ways AI is already changing medicine


Take ophthalmology. The top cause of loss of vision in adults worldwide is diabetic retinopathy, a condition that affects about a third of people with diabetes in the US. Patients should be screened for the condition, but that doesn't always happen, which can sometimes delay diagnosis and treatment — and lead to more vision loss. Researchers at Google developed a deep learning algorithm that can automatically detect the condition with a great deal of accuracy, Topol found. According to one paper, the software had a sensitivity score of 87 to 90 percent and 98 percent specificity for detecting diabetic retinopathy, which they defined as "moderate or worse diabetic retinopathy or referable macular edema by the majority decision of a panel of at least seven US board-certified ophthalmologists." Doctors at Moorfields Eye Hospital in London took that work a step further. They trained an algorithm that could recommend the correct treatment approach for more than 50 eye diseases with 94 percent accuracy.
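For readers unfamiliar with the sensitivity and specificity figures quoted above, they come from a standard confusion-matrix calculation. The sketch below is a generic illustration of that arithmetic; the counts used are hypothetical round numbers, not data from the Google study.

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Compute the two screening metrics from confusion-matrix counts.

    Sensitivity (true-positive rate): share of diseased eyes the
    model correctly flags  = TP / (TP + FN).
    Specificity (true-negative rate): share of healthy eyes the
    model correctly clears = TN / (TN + FP).
    """
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Hypothetical counts chosen to mirror the figures quoted in the text:
sens, spec = sensitivity_specificity(tp=87, fn=13, tn=98, fp=2)
# sens -> 0.87 (87% sensitivity), spec -> 0.98 (98% specificity)
```

Note that the two metrics trade off against each other as the model's decision threshold moves, which is why screening papers report both rather than a single accuracy number.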


Is it time we raised expectations of politicians on cyber security?
Fortunately, not all MPs today are as dismissive of the cyber security threat as they may have been in the past. Sir David Amess provided an example from his constituency in Southend West, where he described “cybercrime having a devastating impact on individuals and businesses.” Amess spoke of a not-for-profit organisation being bankrupted as the result of a data breach – an all-too-familiar occurrence in recent years. MPs themselves are not immune to suffering data breaches. Onwurah explained how her office was a victim of a cyber-attack, but fortunate that it did no real damage. “As an MP’s office we had a big department supporting us and there was no compromise of constituents’ data,” Onwurah remarked. “If we had been a small business, we wouldn’t have had access to that kind of support, and it could have put us out of action for a lot longer.” This is undeniably true, as data breaches have become extinction events for many businesses.


Password-spray IMAP attacks
Legacy protocols (such as POP and IMAP) make it more difficult for service administrators to implement authentication protections like multi-factor authentication, according to Proofpoint. In turn, the lack of multi-factor authentication means that threat actors launching attacks through IMAP can avoid account lock-out and compromise accounts unnoticed. “Attacks against Office 365 and G Suite cloud accounts using IMAP are difficult to protect against with multi-factor authentication, where service accounts and shared mailboxes are notably vulnerable,” researchers said. IMAP-based password-spraying campaigns appeared in high volumes between September 2018 and February 2019, according to the report, especially those targeting high-value users such as executives and their administrative assistants. “Targeted, intelligent brute-force attacks brought a new approach to traditional password-spraying, employing common variations of the usernames and passwords exposed in large credential dumps to compromise accounts,” researchers said in a posting.
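The attack pattern described above inverts classic brute force: instead of hammering one account with many passwords (which triggers lockout), the attacker tries a few common passwords against many accounts. One consequence is that defenders can look for the inverted signature in authentication logs. The following is a minimal illustrative sketch of that detection idea, not any vendor's actual product logic; the function name, thresholds, and event format are all assumptions made for the example.

```python
from collections import defaultdict

def flag_password_spray(events, account_threshold=20, per_account_max=3):
    """Flag source IPs showing a spray pattern: failed logins spread
    across many distinct accounts, with only a few attempts each.

    `events` is an iterable of (source_ip, account) pairs taken from
    failed-authentication log entries (hypothetical format).
    """
    attempts = defaultdict(lambda: defaultdict(int))
    for src_ip, account in events:
        attempts[src_ip][account] += 1

    flagged = []
    for src_ip, per_account in attempts.items():
        # Many accounts touched, but each only a handful of times:
        # the opposite shape of a single-account brute-force attack.
        if (len(per_account) >= account_threshold
                and max(per_account.values()) <= per_account_max):
            flagged.append(src_ip)
    return flagged

# Synthetic log: one IP sprays 25 accounts once each, another
# brute-forces a single mailbox 50 times.
events = [("10.0.0.1", f"user{i}") for i in range(25)]
events += [("10.0.0.2", "ceo")] * 50
print(flag_password_spray(events))  # only the spraying IP is flagged
```

Real-world detection is harder than this sketch suggests, since the campaigns in the report distribute attempts across many source IPs and pace them slowly; the point of the example is only the per-account-versus-per-source counting logic.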



Cybersecurity: Why bosses are confident, and tech workers are scared


At the top of business, there seems to be a lot of self-congratulatory box ticking, while elsewhere in the organisation there is a nagging sense that something very bad is about to happen. Two recent pieces of research reflect the ongoing disconnect. The UK government's annual survey of cyber security at big businesses shows that awareness of cyber risk is growing at the top of business. Nearly three quarters of firms said their board sees the risk of cyber threats as high or very high, in comparison to all the risks they face. And nearly all FTSE 350 companies now have a cybersecurity strategy, even if only half of them actually back up those fine words with cold, hard cash. Similarly, nearly all have a cybersecurity incident-response plan, even if only 57 percent actually test it on a regular basis. And yet a separate survey by security company LogRhythm of 1,500 IT professionals in big businesses shows that while the board may feel it is in control, the tech workers themselves are deeply worried.



Quote for the day:



"If you're not failing once in a while, it probably means you're not stretching yourself." -- Lewis Pugh