Daily Tech Digest - October 12, 2017

App Economy Talent
Traditionally, security was an afterthought in the development cycle, but over the past few years it has quickly become a core part of the process. Now aptly called DevSecOps, the process incorporates security earlier into the software development and testing phases as a means to achieve faster, higher-quality outcomes that are both innovative and secure. While DevSecOps is growing in popularity, organisations are still struggling to combat malware injections and data breaches, because their developer and IT teams don’t have the security knowledge or skills needed to launch products threat-free. ... With more than half of organisations using DevOps practices across their business or within teams, this skills debt is bound to have a real impact on the productivity of businesses, the safety of their products, and the quality of the applications that ultimately form the foundation of today’s digital economy.


Cybersecurity is key for the smart cities of tomorrow

Without a secure cyber foundation, smart cities will crumble. Built on a secure cyber foundation, smart cities will thrive. We were encouraged to see that the proposed legislation specifically focused on developing a “skilled and savvy domestic workforce to support smart cities.” At the heart of the secure smart cities of tomorrow will be a dynamic IT workforce, confident and capable of training and re-training on a consistent basis to stay ahead of the latest threats. Our research shows that just 35 percent of government officials believe that their organizations are well equipped to handle the cyber requirements of smart city projects. Moreover, 40 percent of government officials and personnel cite skills gaps and a lack of necessary technology expertise as a primary concern affecting the expansion of smart city initiatives.


Enterprises 'radically outstripping' traditional technology: Nokia

"Nokia is seeing a watershed moment for the industry as significant global trends and changes to the cost base of what we have traditionally considered carrier capabilities are changing the dynamic of the way carriers are addressing the business market for telecommunications," he added. "The challenge now is for our industry -- carriers and suppliers -- to meet business halfway and ensure they understand that we are here to contribute to their future. Industrial network requirements are rapidly shifting, and networks are changing to meet those needs." Labelling 5G as more than just the next evolution of the traditional network, Conway said it will accelerate this transformation of industry. "5G is specifically being designed to cater for the tens of billions of devices expected for our automated future," he said.


Cyber threats are among top dangers, says Nato

One of the biggest challenges is bringing innovation to Nato’s approach to cyber defence faster, he said. “This is one of the objectives where we still need to push a little harder,” he added. Ducaru said recognising cyber space as an operational domain requires a change of assumption. Previously, Nato worked under the assumption that it could rely on its systems and the integrity of the information, he said. “We concluded that this assumption was no longer valid, and that we needed to change our training, education and planning with the assumption that systems will be disrupted, that we will constantly be under cyber attack, and that we will need to achieve missions under these conditions,” he said. As a result, Nato has switched its focus from “information assurance” to “mission assurance” to support essential operations.


Converging big data, AI, and business intelligence

Although different GPU-based database and data analytics solutions offer different capabilities, all are designed to be complementary to or integrated with existing applications and platforms. Most GPU-accelerated AI databases have open architectures, which allow you to integrate machine learning models and libraries, such as TensorFlow, Caffe, and Torch. They also support traditional relational database interfaces, such as SQL-92 and ODBC/JDBC. Data scientists are able to create custom user-defined functions to develop, test, and train simulations and algorithms using published APIs. Converging data science with business intelligence in one database allows you to meet the criteria necessary for AI workloads, including compute, throughput, data management, interoperability, security, elasticity, and usability.
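The user-defined-function pattern described above varies by vendor, so as a neutral sketch of the concept the example below registers a Python function as a SQL UDF using SQLite, which ships with Python; the `score` function is a hypothetical stand-in for a trained model's predict call, not any particular product's interface.

```python
import sqlite3

def score(amount, risk_factor):
    """Toy stand-in for a trained model's predict() call."""
    return round(amount * risk_factor, 2)

conn = sqlite3.connect(":memory:")
# Register the Python function as a two-argument SQL UDF named "score".
conn.create_function("score", 2, score)

conn.execute("CREATE TABLE txns (amount REAL, risk REAL)")
conn.executemany("INSERT INTO txns VALUES (?, ?)", [(100.0, 0.1), (250.0, 0.5)])

# Analysts can then query the model through plain SQL,
# with no separate machine-learning pipeline.
rows = conn.execute("SELECT amount, score(amount, risk) FROM txns").fetchall()
print(rows)  # [(100.0, 10.0), (250.0, 125.0)]
```

GPU databases expose the same registration step through their own published APIs; only the function body and the registration call change.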


Olympic Games Face Greater Cybersecurity Risks

While most past attacks on sporting events have centered on IT systems at stadiums and ticket sales and operations, future cyberattacks at the Olympics may occur in eight key areas, says Cooper. The areas include cyberattacks to facilitate terrorism, kidnappings and panic-induced stampedes; altering scoring systems; changing photo and video replay equipment; tampering with athlete care food dispensing systems; infiltrating monitoring equipment; tampering with entry systems; and interfering with transportation systems. "I was surprised to learn there are instances where human decisions are overridden by technology," Cooper said, in reference to a growing reliance on using technology to make the first call in a sporting event, rather than a human referee. She pointed to the reliance on Hawk-Eye electronic line-calling technology, used in sports such as tennis.


Why Machine Learning and Why Now?


Although machine learning has already matured to the point where it should be a vital part of organizations’ strategic planning, several factors could limit its progress if leaders don’t plan carefully. These limitations include the quality of data, the abilities of human programmers, and cultural resistance to new ways of working with machines. However, the question is when, not if, today’s data analysis methods become quaint relics of earlier times. This is why organizations must begin experimenting with machine learning now and take the necessary steps to prepare for its widespread use over the coming years. What is driving this inexorable march toward a world that was largely constrained to cheesy sci-fi novels just a few decades ago? Advances in artificial intelligence, of which machine learning is a subset, have a lot to do with it.


Creating a Strategy That Works

Distinctive capabilities are not easy to build. They are complex and expensive, with high fixed costs in human capital, tools, and systems. How then do businesses such as IKEA, Natura, and Danaher design and create the capabilities that give them their edge? How do they bring these capabilities to scale and generate results? To answer these questions, we conducted a study between 2012 and 2014 of a carefully selected group of extraordinary enterprises that were known for their proficiency, for consistently doing things that other businesses couldn’t do. From dozens suggested to us by industry experts, we chose a small group, representing a range of industries and regions, that we could learn about in depth — either from published materials or from interviews with current and former executives.


Understanding the hidden costs of virtualisation

Today, data underpins business continuity, and user expectations for server uptime are therefore higher than ever before. The prospect of downtime is more punishing than ever for a company’s reputation and bottom line, meaning it must be avoided. This places added pressure on IT administrators to keep all machines up and running. Ideally, a fully dynamic and optimised infrastructure is achieved by an IT admin carefully running through a checklist or policy each time a new virtual machine (VM) is “spun up”. In reality, IT administrators are extremely strapped for time and can no longer afford to manually go through checklists. Instead, they are spending their resources on keeping the data centre lights on by ensuring users have access to the data and files they need to keep the business moving forward.
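The per-VM checklist lends itself to automation. The sketch below encodes a hypothetical policy (the tag names and thresholds are illustrative, not taken from any particular virtualisation product) so that every newly spun-up VM is checked without manual effort:

```python
# Hypothetical policy checklist for newly spun-up VMs; the required
# tags and limits are illustrative assumptions.
REQUIRED_TAGS = {"owner", "backup_policy", "expiry_date"}

def vm_policy_violations(vm):
    """Return the checklist items a new VM fails, so no item is
    skipped even when administrators have no time to check by hand."""
    violations = []
    missing = REQUIRED_TAGS - set(vm.get("tags", {}))
    if missing:
        violations.append(f"missing tags: {sorted(missing)}")
    if not vm.get("monitoring_enabled", False):
        violations.append("monitoring agent not enabled")
    if vm.get("snapshot_count", 0) > 5:
        violations.append("too many snapshots retained")
    return violations

new_vm = {"tags": {"owner": "app-team"}, "monitoring_enabled": False}
print(vm_policy_violations(new_vm))
```

Hooked into the provisioning pipeline, a check like this runs on every spin-up, which is exactly the work time-strapped administrators drop first.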


Much GDPR prep is a waste of time, warns PwC

Although some organisations claim to be following a risk-based approach to GDPR compliance, Room said that if that activity is not “anchored to a taxonomy of risk”, it is “purposeless”, and purposeless activity is one of the quickest ways of being hit by enforcement action. For organisations that have not done any GDPR preparation with just seven months to go before the compliance deadline of 25 May 2018, Room said the biggest risk is that all the third-party service providers that could help have already been snapped up and are working to capacity. In addition to legislative compliance risk, there is also the risk of failing to deliver a GDPR programme, he said, and regulator risk, because the Information Commissioner’s Office and all the other EU data protection authorities also form part of the spectrum of risks.



Quote for the day:


"The final test of a leader is that he leaves behind him in other men, the conviction and the will to carry on." -- Walter Lippmann


Daily Tech Digest - October 11, 2017

Development-centric folks tend to come from a mindset where change is the thing that they are paid to accomplish. The business depends on them to respond to changing needs. Because of this relationship, they are often incentivized to create as much change as possible. Operations folks tend to come from a mindset where change is the enemy. The business depends on them to keep the lights on and deliver the services that make the business money today. Operations is motivated to resist change as it undermines stability and reliability. When that disconnect is repaired, not only will both sides be less frustrated, they're able to work much faster. For the enterprise as a whole, DevOps improves both business agility and IT alignment. In other words, businesses can deliver improved services to users while using IT to more efficiently achieve business objectives.


CIO interview: Lee Edwards, IT director, NHS Shared Business Services

Robotic process automation (RPA) will be a key area of focus for Edwards’s team over the coming year when it comes to digital initiatives. This includes a pilot and implementation of RPA and chatbot technology across a number of operational areas. “This will bring significant efficiencies to our back office functions and help improve customer services through improved speed and accuracy of processing,” says Edwards, adding that recent proofs of concept have integrated RPA technologies such as Kapow and Blue Prism into a Microsoft Azure technology stack. According to Edwards, there is a lot of potential in joining RPA with technologies such as artificial intelligence (AI) and big data analytics to bring noticeable improvements to back-office processes and the customer engagement experience at NHS SBS.


Why Your Business May Not Be Ready for AI

"Generally, the biggest impediment we see out there is the inability to create a portfolio of initiatives, so having a team or a number of teams coming back and saying, 'These are the 50 things I could do with AI based on what AI is able to do today and in the next 12 months,' and then [it's up to senior management to] prioritize them," said Giacomelli. "You need to have people going through the organization, unearthing places where value can be impacted." Over the last three decades or so, business leaders have been setting strategy and then implementing it, which isn't going to work moving forward. The AI/human equation requires a hypothesis-driven approach in which experiments can fail fast or succeed. "It's a lot more about collective intelligence than let's get a couple of experts and let them tell us where to do this. There are no experts here," Giacomelli said.


Artificial Intelligence: Experts Talk Ethical, Security Concerns

"What's core is making sure there's clear accountability, and being concerned with the types of controls we seek in AI," Sharkey continued. There is a need for deep learning and deep reinforcement learning as we seek AI applications in child care, elder care, transport, and agriculture. "Future-proofing" AI should consider its implications for human rights. "Artificial intelligence transforms everything around us; every industry, our health, our education," explained Aleksandra Przegalinska-Skierkowska, assistant professor at Kozminski University and research fellow for Collective Intelligence at MIT. "Especially if we want autonomous vehicles or virtual agents, we need a code of conduct for them." We are at a point when people have begun to reflect on issues related to machine ethics and morality, she added.


Business needs help to act on cyber security advice


“Part of our role is also to cut through the noise and show businesses that the things that will make a difference for the majority of small businesses cost little or nothing to implement.” Many of the things small businesses can do to improve their cyber security only have a cost in time and effort, said Unsworth. “Cyber security is not always about buying a technical solution,” he added. Investments in security technologies depend on the size of the business, the business operating model and what the business is trying to achieve, he said. “So for businesses that handle sensitive information, there is a cost because they need to ensure that data is protected and demonstrate that they have a good security posture.” The role of the London DSC is to identify and prioritise business needs in terms of cyber security controls, said Unsworth.


Is Your Company Ready to Face Tomorrow’s Security Risks?

IT security strategy should take into account a few important factors, such as business and corporate strategy, IT strategy, compliance and standards, regularly repeated analysis of threats, risks and current security state. The starting point for building an IT security strategy should be the determination of goals and direction of the company and its business. Then, the assessment of the current security state should take place. Within this assessment, deep knowledge of the company, its processes, functions and business is needed. The security strategy should always be compatible with the business and company strategy, taking into account future plans and products. If we know where we are, and we understand where the company is heading, we can start working on specifying the desired state of the company’s security and methods


Businesses and their employees admit to GDPR confusion

It is not just inside the office where people need to consider compliance, as 30% of people admitted to having viewed someone else’s laptop during their commute. Businesses must ensure that even when working on the go, their employees are preventing others from obtaining information on their customers and contacts. The new regulations will protect consumers against companies that hold inaccurate and unneeded data about them, as well as ensuring greater emphasis is put on prominent and unambiguous customer consent with the ability to withdraw at any time. ... “Despite the impending GDPR deadline, our research shows that many companies don’t appear to have systems and policies in place to protect sensitive information. If this data is then stored illegally – or falls into the wrong hands – the damage caused to the organisation could be irreparable.”


SailPoint survey reveals cyber risks are leaving IT in the dark

According to SailPoint’s 9th annual Market Pulse Survey, senior IT decision makers say that more cyber risks are being created outside of the IT department’s visibility; yet it remains IT’s responsibility to mitigate these risks. The survey found that organisations need to better define and enforce corporate policies company-wide, addressing risks like shadow IT and bring your own device (BYOD) given today’s increasingly mobile, agile workforce. Recent sprawling attacks like WannaCry that affected organisations worldwide, as well as direct attacks on organisations of all sizes and in all industries, have demonstrated the significant organisational damage they cause. In the wake of data breach pandemics at levels seen over the past year, most organisations should take stock of the security controls they currently have in place


Artificial Intelligence Threats and Promises

Artificial intelligence is unleashing the next wave of digital disruption. Early adopters are already creating competitive advantages. Companies that combine a strong digital foundation with an aggressive adoption of AI are well poised to become leaders in their industry. This potentially allows old-world companies a way to remain relevant in the wake of global competition. The adoption of AI is rapidly reaching a tipping point. A Narrative Science survey last year found 38 percent of enterprises are already using AI, growing to 62 percent by 2018. Forrester Research predicted a greater than 300 percent increase in AI investments from 2016 to 2017. IDC estimated the AI market will grow from $8 billion in 2016 to more than $47 billion in 2020.


Five Key Differences Between Personal and Enterprise Endpoint Security

Most home computer users have administrator rights on their devices. Administrators can install software, which means that attackers can install malware by compromising these credentials. People working in corporate environments rarely have administrator access to their own machines, so they cannot update the operating system or critical programs on their own. As a result, phishing and social media attacks are somewhat more likely to succeed in a consumer environment. However, attacks that manage to infect corporate devices are much more difficult to remove and usually require intervention from those responsible for centralized computer management. If a vendor issues a patch for a significant vulnerability or newly discovered exploit, those responsible for updating systems must do so quickly. 



Quote for the day:


"The actions of a responsible executive are contagious." -- Joe D. Batten


Daily Tech Digest - October 10, 2017

IT spending increases for software-defined storage, on-demand services
SDS is gaining popularity because of its versatility in a modern data center. Enterprise storage has been migrating away from hardware-defined arrays as data centers move to virtualization and cloud-based infrastructure. SDS solutions run on commodity hardware, using virtualization to deliver all functionality, such as provisioning and de-duplication, in software. This adds automation, and thus speed, to storage networks. "For IT organizations undergoing digital transformation, SDS provides a good match for the capabilities needed — flexible IT agility; easier, more intuitive administration driven by the characteristics of autonomous storage management; and lower capital costs due to the use of commodity and off-the-shelf hardware," said Eric Burgener, research director at IDC, in a statement.


Rise in Insider Threats Drives Shift to Training, Data-Level Security

With an insider threat, the culprit is already inside the network. Securing the perimeter around the network — which has long been the focus for enterprise security — does not do the job against this kind of a threat, whether it is malicious or unintentional. Nor is focusing on securing the perimeter the best strategy against many external threats. That's because data-smart companies want to be able to safely give partners, suppliers, and customers access to their networks in order to increase business opportunities. As a result of this shift, security needs to rest with the data itself, not just at the network level. The move to the cloud elevates the need for data-level protection. To reduce the risk of insider threats, companies and organizations need to focus on three areas


Understanding Cloud Native Infrastructure: Interview with Justin Garrison and Kris Nova

A major benefit of public cloud comes from process rather than performance. The people-hours you can save by becoming an infrastructure consumer rather than an infrastructure builder will be very difficult to calculate, but will likely enable a new method of working that far outweighs the technical limitations of a public cloud. Not to mention some of the best infrastructure builders and maintainers in the world work at public cloud providers, and the companies behind them spend billions every year building out the infrastructure, R&D, and new features. The biggest consideration when building your own cloud is not what it will cost you to build the private cloud, but what it will cost you to maintain it and what happens when you fall behind public cloud offerings.


Make Cybersecurity A Priority in a Small Business’ Early Stages


The need for strong passwords is crucial for cybersecurity, no matter how often we groan about having to change (and remember) a new one. Shubhomita Bose covers this, along with data from Headway Capital, for smallbiztrends.com. The Headway infographic emphasizes having a company policy to avoid “weak” passwords, to change passwords on a regular basis, and to incorporate “two-factor authentication” — as some businesses are now doing with an additional text-message step in the password process. Ransomware is an increasingly significant threat to cybersecurity. ... As Anita Campbell, CEO of Small Business Trends, writes for Inc.com, “The ransom is displayed on the screen with a message stating you must pay a fine or fee in order to access your own system. Ransoms have ranged from hundreds of dollars to tens of thousands of dollars.”
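A "no weak passwords" policy can be enforced in code as well as on paper. The sketch below checks a few illustrative rules; the thresholds are assumptions, not Headway Capital's recommendations, and a check like this complements rather than replaces two-factor authentication:

```python
import re

def password_issues(pw, min_length=12):
    """Flag passwords that violate a simple company policy.
    The rules and thresholds here are illustrative; tune them
    to your own policy."""
    issues = []
    if len(pw) < min_length:
        issues.append(f"shorter than {min_length} characters")
    if not re.search(r"[A-Z]", pw):
        issues.append("no uppercase letter")
    if not re.search(r"[0-9]", pw):
        issues.append("no digit")
    if not re.search(r"[^A-Za-z0-9]", pw):
        issues.append("no symbol")
    return issues

print(password_issues("hunter2"))
# ['shorter than 12 characters', 'no uppercase letter', 'no symbol']
```

Run at registration or password-change time, a check like this turns the written policy into something the system enforces automatically.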


Leaving employees to manage their own password security is a mistake

“Far too many organizations are leaving the responsibility for password management to their employees and don’t have the automated password management technology in place to identify when things are going wrong.” “In many cases, an organization’s password management practices are overly reliant on manual processes and far too often place an excessive level of trust in employees to use safe password practices,” said Matt Kaplan, GM of LastPass. “The threat posed by human behavior coupled with the absence of technology to underpin policy is leaving companies unnecessarily at risk from weak or shared passwords. Organizations need to focus on solving for both obstacles in order to significantly improve their overall security.”


How IPv6 deployment affects the security of IoT devices


As a result of their vast address space, IPv6 devices are provisioned with at least one unique global address and, thus, NATs are doomed to disappear. Therefore, a NAT's enforcement of the filtering policy to only allow outgoing communications is also likely to disappear, meaning communication between internal and external systems may no longer be policed by the network. In fact, the distinction between internal and external networks may disappear altogether if a filtering policy is not enforced at the network border. While this could have potential benefits -- for example, for peer-to-peer applications, in which unsolicited inbound communications are common -- this clearly comes at the expense of increased attack exposure.
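The filtering policy a NAT enforced as a side effect, accepting inbound traffic only for flows an internal host initiated, can be sketched as a simple flow table. Addresses below are from the 2001:db8::/32 documentation prefix, and the flow key is simplified to a 4-tuple:

```python
# Minimal sketch of stateful filtering at an IPv6 network border:
# inbound packets are accepted only if they are replies to a flow
# that an internal host opened first.
established = set()

def record_outbound(src, sport, dst, dport):
    """An internal host initiates a flow; remember its 4-tuple."""
    established.add((src, sport, dst, dport))

def allow_inbound(src, sport, dst, dport):
    """Accept inbound traffic only if it reverses a known flow."""
    return (dst, dport, src, sport) in established

# Internal host 2001:db8::10 opens an HTTPS connection outward.
record_outbound("2001:db8::10", 50000, "2001:db8:ffff::1", 443)

print(allow_inbound("2001:db8:ffff::1", 443, "2001:db8::10", 50000))  # True
print(allow_inbound("2001:db8:ffff::2", 443, "2001:db8::10", 50000))  # False
```

Without NAT, a rule set equivalent to this must be deployed deliberately at the border firewall, or the internal/external distinction described above disappears.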


Organizational Culture Needs To Change So That Security And DevOps Can Exist In Tandem

Cloud adoption often came to be described as ‘shadow IT’ or ‘bypass IT’, because it often occurred outside of the mainstream IT and mainstream IT security groups. So in a sense IT and IT security are still playing catch-up to the original adoption of cloud, even if they have been given responsibility for it now. And we have started to see that change. Two years ago, even in the US, we were often working with those ‘shadow IT’ projects. Now the responsibility is moving into IT and IT security, so they’re bringing the traditional mindset. I think the remaining roadblock is that the developer pipeline is moving at a much faster pace than it did historically, where application introduction used to occur maybe in months


Intel plans hybrid CPU-FPGA chips

“The advantage for FPGA is GPUs play in some areas but not all, and if you look at the use model of inline vs. offload, they are limited to offload mostly. So, there’s a broader application space you can cover with FPGA,” he said.  The integrated solution provides tight coupling between CPU and FPGA with very high bandwidth, while the external PCI Express card is not as tightly coupled. For ultra-low latency and high-bandwidth applications, integrated is a great fit, Friebe said.  “Most of the differentiation [between integrated and discrete] is due to system architecture and data movement. In a data center environment where [you] run many different workloads, you don’t want to tie it to a particular app,” he said.  The more you do specialization, the more performance you can squeeze out of the accelerator, said Friebe. 


The future of mobility: Are we asking the right questions?

One of the categories requiring the sharpest questions about the future is mobility. The mobile present has many moving parts and is very complex, but base patterns are discernible. I believe every human on this planet needs at least to attempt to comprehend the current point to which the mobile revolution has brought us. Furthermore, I believe modern executives have a fiduciary responsibility to think long and hard about where the mobile revolution is taking us.  The most rapidly adopted consumer technology in the history of mankind, mobile technology has had a huge economic impact — more than $1 trillion — and has changed the corporate competitive landscape as well as how people live their daily lives. Some go so far as to argue that mobile technologies have changed what it is to be human.


Detecting and Analyzing Redundant Code

A typical analysis would involve running the tool repeatedly to prune back the source tree as brutally as possible. This was then followed by several cycles of reverting changes so as to get successful builds and passing tests. The reasons for failure were that the tool had behaved incorrectly or there was a known limitation, examples of the latter being reflection or the existence of a code contract. The tool was trained on various GitHub repositories for C# projects that were chosen on the basis that I had used them and thus wanted to contribute back. Ultimately a pull request was submitted to the community asking for discussion of the changes in my branch. As the tool is brutal, and I was engaging online with people for the first time, this was where diplomacy was required; hopefully I didn’t offend too many people.



Quote for the day:


"When you're around someone good, your own standards are raised." -- Ritchie Blackmore


Daily Tech Digest - October 09, 2017

How devops changes monitoring
Devops is speeding up the entire application lifecycle, from development to QA to production. Relatively static production applications are now being updated as frequently as multiple times a day. This leads to many challenges, some old and some new. Developers have had to adapt by writing more comprehensive automated tests for their code, so that QA is as automated as possible. QA has become dependent on continuous integration, which automatically runs all of the unit and integration tests whenever new code is committed. Monitoring systems are now becoming more aware of every part of the devops toolchain. Before devops, new application updates would be carefully administered by highly skilled technicians.
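The automated tests a CI system runs on every commit can be as small as the sketch below; `apply_discount` is a made-up stand-in for real business logic:

```python
import unittest

def apply_discount(price, percent):
    """Business logic under test: apply a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent out of range")
    return price * (1 - percent / 100)

class DiscountTests(unittest.TestCase):
    def test_typical_discount(self):
        self.assertAlmostEqual(apply_discount(200.0, 25), 150.0)

    def test_rejects_bad_input(self):
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)

if __name__ == "__main__":
    # A CI job runs the whole suite like this on every commit.
    unittest.main(argv=["ci"], exit=False)
```

A commit that breaks either test fails the CI build before it ever reaches QA or production, which is what makes multiple deployments a day survivable.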


How the Finance Industry Uses Cloud to Streamline Compliance

Regulatory requirements in the U.S., including the Payment Card Industry Data Security Standard (PCI DSS) and the Gramm-Leach-Bliley Act (GLBA), along with data privacy laws like Europe’s GDPR, will dramatically impact U.S. companies doing business in Europe. As a result, financial institutions not only need to protect data, but they need to be able to regularly report on the effectiveness of their data protection measures. Compliance is a major consideration for financial companies, with detrimental fines and customer loyalty on the line. According to a 2015 SANS Survey, 72 percent of respondents cited maintaining compliance in the cloud as their biggest concern across all types of cloud deployment. Nonetheless, the cloud has proven to be an optimal environment for facilitating compliance


Are You Ready For A Data Protection Regime?

Creation of a data protection regime is by far the most obvious and immediate fallout, something which the honorable judges have pointed out as well. Protected personal data is one of the most important means to protect an individual’s privacy. Ensuring it is becoming a tough challenge in this era of digital everything. “We are in an information age. With the growth and development of technology, more information is now easily available. The information explosion has manifold advantages but also some disadvantages. The access to information, which an individual may not want to give, needs the protection of privacy. The right to privacy is claimed qua the State and non-State actors. Recognition and enforcement of claims qua non-state actors may require legislative intervention by the State,” Justice Sanjay Kishan Kaul said in his judgment.


What you need to know about concurrency models

The advantage of multiprocess is that it is very stable. If a process dies, it doesn’t have to affect anything else. The disadvantages are that starting a new process is pretty expensive, and communicating between subprocesses is cumbersome. There are ways, but honestly there aren’t good ways to communicate between these processes. You have awful ideas like shared memory, operating-system-specific interprocess communication methods, and actual remote procedure calls (RPC). Frankly, having a server open up socket connections to itself is gross. No modern software should be written this way. There was a time when Linux and other platforms didn’t actually support native threads and had to fork off a subprocess for concurrency. So supporting multiple platforms meant you had to do these torturous things. But in the modern era, there are better ways.


Hitachi preparing AI for enterprise storage

In storage, software intelligence will probably start with more mundane concerns such as capacity planning, what will break, and when it will break, Vantara senior vice president Iri Trashanski said. It falls under the categories of application performance management and infrastructure performance management, he emphasized. There is already an AI focus in the IoT division's Lumada software. "If you build it correctly it can be used across groups," Trashanski noted. ... "This is a pretty broad topic. AI can be applied to a number of areas in storage," Rogers said by email. "On the operational side, we are exploring automating root-cause analysis and resolution tasks. Right now if storage breaks in the middle of the night then someone is woken up to [find the] root cause and resolve. We think that the system should be able to learn how to resolve these challenges on its own," he wrote.


CSO Interview: Must-have Skills for New-age Cyber Security Professionals

Explaining his viewpoints on combating such vector threats and his overall security strategy for the Bank of Sharjah, Mani says he is a firm believer in multi-layered security arrangements. "I prefer to architect the security eco-system of my organization to be a multi-layered defense in depth approach. This starts with a governance layer, then continues with physical & environmental security, network security, infrastructure security, systems security, application security, database security and operating system security. This is how I have architected the security strategy of my organization. To add more value to this Enterprise Security Architecture, I have layers of support provided by my internal audit and risk management teams, who review the cyber security practices of my organization on a periodic basis to give an assurance on the robustness of the cyber security practice."


Google's Clips Camera Offers A Snapshot Of Things To Come

Google Clips uses artificial intelligence (A.I.) to choose when to take pictures. To “use” the camera, you twist the lens to get it started, place it somewhere, then forget about it. It learns familiar faces, then favors those people (and pets!) when deciding when to take pictures. It looks for smiles and action, novel situations and other criteria. It discards blurry shots. Each time it takes pictures, it captures a burst of photos at 15 frames per second, which you can use or edit as a GIF or from which you can cherry-pick your favorite still photographs. The Clips has no microphone, and it cannot record sound. In short, the A.I. is designed to take great pictures and GIFs, but with the advantage of taking pictures where there’s no photographer around to change the actions of the photographed.


CIOs, IT workers don’t always see eye-to-eye on priorities, challenges

DelPrete said when it comes to the cyber threat, respondents were most concerned about aging or old systems, followed by human error, malware and phishing attacks. “As we talked to the CIOs and asked them what they are doing about this, I think a renewed emphasis, and it seemed like some improved results, as it relates to cybersecurity awareness training,” he said. “Many felt that the agency was a little bit more effective in identifying and mitigating cyber risk today than they were a year ago, largely as a result around the dialogue they have and the training.” The survey pointed to simple changes, like requiring employees to answer cybersecurity awareness questions before they can log on to their computers, or creating mock phishing campaigns to help employees recognize potential threats.


What is Kaspersky's role in NSA data theft?


What's more likely is that the Kaspersky product detected one of the NSA's hacking tools after it was taken out of the safe confines of the agency's offices, and flagged it. "Experts said the software, in searching for malicious code, may have found samples of it in the data the contractor removed from the NSA," said the report. It's not surprising that Kaspersky would recognize either known malware or hacking tools, or malware-type behavior. The company, like others, already has a wealth of antivirus signatures of leaked NSA hacking tools at its disposal, including infamous malware strains like Stuxnet -- which researchers believe was developed by the NSA. Hacking tools used by the so-called Equation Group, thought to be a hacking unit within the NSA, were publicly exposed by a hacking group


Put Humans at the Center of AI

We need to be much more human-centered. If you look at where we are in AI, I would say it’s the great triumph of pattern recognition. It is very task-focused, it lacks contextual awareness, and it lacks the kind of flexible learning that humans have. We also want to make technology that makes humans’ lives better, our world safer, our lives more productive and better. All this requires a layer of human-level communication and collaboration. ... What makes humans unique is that evolution gave us the most incredible and sophisticated vision system, motor system, and language system, and they all work together. Visual Genome is exactly the kind of project that’s pushing the boundaries of language understanding and visual understanding. And eventually we’re going to connect with the world of robotics as well.



Quote for the day:


"Doing easily what others find difficult is talent; doing what is impossible for talent is genius." -- Henri-Frederic A.


Daily Tech Digest - October 06, 2017

Come to grips with immutable infrastructure in production

Immutable infrastructures often are refreshed frequently or run for relatively short periods, unlike traditional servers that host workloads that are online for years. The longer an instance runs in the data center environment, the higher the chance of an unanticipated configuration change. This rarely crashes an instance or renders a workload unavailable, but likely impairs performance or returns errors. An immutable infrastructure should support automatic application scaling as traffic demands change. To achieve resilience and allow for failures without application disruption, deploy instances in clusters behind load balancers. Automation processes can scale instances in response to traffic. An application performance management tool monitors objective measures of the workload's operation.
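The "scale instances in response to traffic" step often reduces to a simple proportional rule driven by the metrics an APM tool reports. A hypothetical sketch, where the function name, target utilization, and bounds are illustrative rather than from the article:

```python
import math

def desired_instances(current, avg_cpu, target_cpu=0.5, min_n=2, max_n=20):
    # Proportional rule: size the cluster so average CPU lands near the target,
    # clamped to a floor and ceiling so a bad metric can never scale to zero
    # or trigger runaway growth.
    proposed = math.ceil(current * avg_cpu / target_cpu)
    return max(min_n, min(max_n, proposed))

print(desired_instances(4, 0.75))  # cluster at 75% CPU scales out to 6
```

Because instances are immutable, acting on this decision means launching fresh instances from the current image and registering them with the load balancer, never resizing or reconfiguring running ones.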


More questions than answers after a third data breach at the NSA is revealed

As to how Kaspersky ties into this data breach, the WSJ report says U.S. investigators believe the unnamed contractor's use of Kaspersky Anti-Virus (KAV) alerted the Russian hackers to the presence of the files. "Experts said the software, in searching for malicious code, may have found samples of it in the data the contractor removed from the NSA. But how the antivirus system made that determination is unclear, such as whether Kaspersky technicians programmed the software to look for specific parameters that indicated NSA material. Also unclear is whether Kaspersky employees alerted the Russian government to the finding," the WSJ reported. One of the major unanswered questions in this story is what caused KAV to hit on these files?


Where is technology taking the economy?

The interesting thing here isn’t the form intelligence takes. It’s that intelligence is no longer housed internally in the brains of human workers but has moved outward into the virtual economy, into the conversation among intelligent algorithms. It has become external. The physical economy demands or queries; the virtual economy checks and converses and computes externally and then reports back to the physical economy—which then responds appropriately. The virtual economy is not just an Internet of Things, it is a source of intelligent action—intelligence external to human workers. This shift from internal to external intelligence is important. When the printing revolution arrived in the 15th and 16th centuries it took information housed internally in manuscripts in monasteries


Security experts air concerns over hackers using AI and machine learning for phishing attacks


In order for enterprises and security suppliers to remain on the front foot with hackers, they will also need to incorporate machine learning and artificial intelligence (AI) into their cyber security strategies, creating what Ghosh terms an “AI on AI” situation. “Security companies that fight these bad guys will also have to adopt machine learning. Now you have an AI on AI scenario, and it will propel us forward to adopt machine learning for real time,” he said. Where the technology comes into its own for enterprises is in the detection of cyber threats, he said. “The volume of data that’s available on certain types of threats like malware is effectively infinite,” he added. “The problem with sticking humans on a malware detection problem is that it’s not a good fit ...”


Put your money where your mouth is

Companies like Netflix, Facebook and Amazon are great examples of how to capitalise on big data and ML capabilities to deliver superior customer experiences. Every organisation has the ability to capture and analyse big data, but it’s how to turn that insight into action that ultimately counts. Think about it this way—you can buy a gym membership, but if you don’t actually go to the gym and use the machines, you’ll never get the desired outcome. Unfortunately, this isn’t the mentality of most organisations, and many purchase ML software but don’t put in the extra effort for it to drive any real business value. Barriers like culture, budget constraints, internal talent, or just a lack of desire to change the status quo have plagued organisations and prevented them from transitioning from “early adopters” to “innovators.”


Leadership in a future of distributed everything


We are fast moving toward a world where hierarchical organizational structures will bend and break. Lots of diverse partners will come together in new ways to create new kinds of organizational structures that will be more fluid, less rigid. While this will feel like a radically new future, and in some ways it is a radically new future, there are deep roots in the past. The future that is about to happen has been brewing for a very long time. Packet switching, the core technology of the internet, was originally called by Paul Baran “hot potato routing,” a much more accurate name. The organization of the future will be shape-shifting, where hierarchies will come and go. Such fluid organizational forms will have no center, they will grow from the edges, and they won’t be controllable.


6 big data privacy practices every company should adopt in 2018

Issues surrounding data privacy are as legally unresolved today as they were two years ago, but the recent Equifax breach now puts a clear focus on them that strikes fear into the hearts of CIOs. The Equifax data that was breached was not big data. However, big data is a major privacy concern for IT because so much of it is coming into enterprise data repositories from so many sources; and it comes in many shapes and sizes. After Equifax, CIOs can rest assured that their CEOs and boards will be following their work in data privacy closely—and big data is one of the areas they'll be most concerned about. What operational steps can IT take to ensure, at a grassroots level, that sound data privacy practices are employed for their big data?


The CISO job seems to be finally getting the credit it's due


In the last ten years, another maturation has occurred in the CISO job, from being solely IT-centric to a leader that is now a trusted advisor to the highest levels of the organization in risk management and data protection. Of course, this is not to say that the responsibilities of a CISO are one-size-fits-all. You will find people holding the title covering multiple areas of the security spectrum, including traditional IT security, awareness, disaster recovery, forensics, operational response, endpoints, risk assessment and more. The scope and scale of the role depends upon the organization's size, finances, security focus and risk appetite. I do see an important differentiation occurring in the last few years: a distinct separation between IT security and information security.


Top Digital Experts Share How Current Data Trends can Drive Business Success

With the overwhelming--and growing--amount of big data available today, the need for total automation for collection and analysis is in demand. Many companies are turning to data management platforms or other software solutions to collect, house, sort, and analyze information in a way that's easy for end-users to see and understand. This automation process works to streamline the analysis of data and can also put an end to fragmented data silos across an organization. "The idea of total automation is really popular right now," explains Taylor Wallick, Director of Digital Strategy at Centric Digital. "Digital tools today can allow you to deliver real-time information to various stakeholders throughout an organization without a single person having to dig through the data and build a presentation around it. ..."


How to set up server weight and HTTPS load balancing with NGINX

If you already have basic load balancing set up, you might not know there are a few configuration options you can use to better suit your hardware. Say, for example, you have three servers running NGINX and are set up for load balancing. However, in that triad of hardware, you might have one server that's more powerful than the other two. Instead of having NGINX treat each of those servers as equals, it'd be nice to instruct the load balancer that Web Server 1 should get preference, Web Server 2 should fall next in line, and Web Server 3 should be the last in the chain of preference. NGINX has that very capability, right out of the box. In fact, setting up server priority, with NGINX, is as easy as setting up load balancing. It's called "server weight" and I'm going to show you just how simple this feature is to configure.
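As a preview of the technique, a weighted `upstream` block combined with HTTPS termination looks roughly like this. The addresses, certificate paths, and weight values below are placeholders:

```nginx
http {
    upstream backend {
        # Higher weight = larger share of requests.
        server 192.0.2.10 weight=4;  # Web Server 1: most powerful, gets preference
        server 192.0.2.11 weight=2;  # Web Server 2: next in line
        server 192.0.2.12 weight=1;  # Web Server 3: last in the chain
    }

    server {
        listen 443 ssl;
        ssl_certificate     /etc/nginx/ssl/example.crt;
        ssl_certificate_key /etc/nginx/ssl/example.key;

        location / {
            proxy_pass http://backend;  # terminate HTTPS here, balance to the pool
        }
    }
}
```

With these weights, roughly four of every seven requests land on Web Server 1, two on Web Server 2, and one on Web Server 3.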



Quote for the day:


"Luck is a dividend of sweat. The more you sweat, the luckier you get." -- Ray Kroc


Daily Tech Digest - October 05, 2017

The technologies used by remote workers to access the information they need to undertake their day-to-day operations from home have needed to evolve. Simply allowing your staff to access the file repositories on your network from home as they would in the office could leave you open to being infected, should their laptop or home PC be carrying a virus. It is just not practical to rely on the IT department to secure the myriad of bring your own device (BYoD) personal endpoints that remote workers use, therefore, other solutions have become necessary. Many organisations have started providing specific remote desktop solutions via the cloud. These send a live snapshot of your office desktop to a remote device. If a user clicks or types, these interactions are reflected on the server.


Spy Wars: How nation-state backed threat actors steal from and copy each other

In a detailed review of the opportunities for such attacks, GReAT researchers identified two main approaches: passive and active. Passive attacks involve intercepting other groups’ data in transit, for example as it moves between victims and command and control servers – and are almost impossible to detect. The active approach involves infiltrating another threat actor’s malicious infrastructure. There is a greater risk of detection in the active approach, but it also offers more benefits as it allows the attacker to extract information on a regular basis, monitor its target and their victims, and potentially even insert its own implants or mount attacks in the name of its victim. The success of active attacks relies heavily on the target making mistakes in operational security.


Cyber Insurance - The Third Wave Is Here

According to Allianz, premium income from cyber security will rise to $20 billion by 2022. That’s a big jump from researchers’ current estimate of $3 billion gross written premiums. But cyber security is not something insurers want to attempt on their own—they will need to acquire, grow or partner for the advanced cyber security skills and capabilities ... The first wave got a tepid response because businesses were oblivious to the threat. The second fell flat as businesses chose to invest in security infrastructure rather than cyber insurance. Today, however, high profile breaches have made both insurance and business leaders keenly aware of the potential business and financial impacts. The magnitude of the threat is being emphasized by the emergence of new industry and government regulations around cyber security. However, cyber insurance is not an easy offering to design.


10 Things You Believe That Are Holding You Back From Success

In Silicon Valley, failing has turned into a badge of honor. The more you fail, the more you are praised. It's kind of like an unusual victory where no one actually wins. This mentality isn't contained to one region though. It has spread throughout the entire tech industry. Bradley says, "Failing fast is a joke. Anybody who says they want to fail fast or fail slow doesn't understand it. You don't want to fail fast. You don't want to fail slow. You don't want to fail." I've had my fair share of failures in the past, and I'm sure you have as well. If you have truly experienced failure, you would know that the last thing anyone would ever want to do is fail. So what's the alternative? "To get ahead, you want to accelerate and improve your rate of learning on everything that you do," Bradley says. 


Firms look to security analytics to keep pace with cyber threats


Monitoring and threat detection are crucial if businesses are to stay ahead of the curve, and security analytics has emerged as a popular way to counter attacks. It involves the collection, aggregation and analysis of security data, usually combining datasets with sophisticated detection algorithms. Security analytics is extremely diverse, and there is a plethora of ways to collect data, including software, cloud resources, external threat intelligence sources and network traffic. According to a report from Markets & Markets, the industry was worth $2.83bn in 2016 and will reach $9.8bn by 2021. But does the sector live up to the hype? As a market, security analytics is packed with suppliers. Ranging from corporate diehards to fast-growing startups, the industry abounds with choice.


The rules of doing business have changed, say tech entrepreneurs

“Entrepreneurial businesses have a much closer understanding than multinationals of how the rules of doing business have changed. This awareness combined with their size and agility has enabled many of them to adjust to the uncertain market conditions. The research shows that tech entrepreneurs are acutely aware of the need to react to changes in market demand and make business decisions more quickly. For many, the ability to lay their hands on reliable and up-to-date data in a format that is tailored to the operational needs of the business is now critical.” With more changes on the agenda, tech entrepreneurs know they could be facing diverse risks in the year ahead. The top three risk factors identified were cash flow difficulties; the potential impact of geopolitical uncertainty on trading activity; and a general lack of resources – in particular, a shortage of senior management time.


Microsoft defends sluggish ramp up of extensions for Edge browser

The sluggish progress in building an Edge add-ons e-mart was, argued Williams, the result of Microsoft's "purposefully metered approach as we onboard new extensions," as well as the company's "high bar for quality." Additionally, said Williams, Microsoft was taking its time approving add-ons because poorly crafted extensions could weaken the security of the browser, impede its performance and degrade its reliability. Her explanations — depending on one's cynicism, they could be called excuses — were in sync with other statements Microsoft has made over the last two to three years. "Our experience over the past 20 years has taught us that poorly written or even malicious add-ons were a huge source of security, reliability and performance issues for browsers," wrote Drew DeBruyne, a general manager in the Edge group


Can't Wait for JNBridge's Java.VS? Program Java in VS Code Now with Red Hat Tool

As we noted earlier, Java support in Visual Studio is skimpy, leading many developers to turn to the "Language Support for Java by Red Hat" extension in the Visual Studio Code marketplace. Although it's still listed as a preview, Red Hat yesterday noted the tool was first announced a year ago and has reached the 1 million download threshold. "During this past year, we made 10 releases, added various features, fixed many bugs but more importantly, we have constantly grown our user base and finally reached and passed a million downloads on the Visual Studio Marketplace," Gorkem Ercan said in yesterday's post. When announcing the extension in September of last year, Gorkem said the initial release sported a modest feature set


IoT can learn from smartphone security

Upping the security of IoT/EoT is not that hard. Most current-generation IoT devices are built using commodity microcontrollers that are not inherently designed with the security components built into mobile phone-derived SoCs (e.g., trusted execution “vaults,” encryption engines, VPNs). With a long history of security enhancements over the years and a compelling need to compete in security features, it’s much more appropriate to utilize “downsized” mobile SoC technology than trying to reinvent security in software on chips not inherently designed with the same number of security subsystems. As a result, downsized mobile SoCs have both a competitive advantage in system capability, even if they may cost somewhat more, and have already-proven security features that microcontrollers generally can’t match.


Leaderships’ evolving role in cybersecurity

Risk management is a standard part of doing business. Organizations are already adept at tolerating and mitigating such problems as shrinkage, downtime, turnover and waste. These are treated not as threats to the business, but as costs to be managed and avoided. Why not take the same approach to cybersecurity? CEO silence damages the security posture of any organization. When top executives talk, things happen, but as long as cybersecurity is delegated to a subgroup of the IT organization, people will believe that it’s someone else’s problem. That’s a shame, because the vast majority of breaches can be prevented with a few basic practices: Choose strong passwords, don’t click on unknown links, keep up-to-date with patches and antivirus definitions and protect devices with authentication. Most business professionals are aware of these facts, yet surprisingly few observe them.



Quote for the day:


"Ineffective leaders don't react to problems, they respond to problems and learn." -- Danny Cox


Daily Tech Digest - October 04, 2017

Faster time to market is a competitive necessity, and this pressure isn’t disappearing any time soon. The speed of business metabolism needs to increase. When banks noticed that there was no point fighting against FinTech startups, they collaborated with them. Today, many large banks are pumping millions of dollars into startups. So how can organizations gear up to the speed they need to win in this new, bold world? You don’t have to figure it out all at once. You just have to be willing to start and to make the organizational changes required for success. It’s far more desirable to be successful at a series of smaller tasks than to fail spectacularly upon attempting to tackle a transformation initiative as a big-bang effort. Set the direction, draw a line in the sand, and understand that line will move. There is nothing wrong with taking time early on to prepare to speed up later.


Gartner Identifies the Top 10 Strategic Technology Trends for 2018

"Gartner's top 10 strategic technology trends for 2018 tie into the Intelligent Digital Mesh. The intelligent digital mesh is a foundation for future digital business and ecosystems," said David Cearley, vice president and Gartner Fellow. "IT leaders must factor these technology trends into their innovation strategies or risk losing ground to those that do." The first three strategic technology trends explore how artificial intelligence (AI) and machine learning are seeping into virtually everything and represent a major battleground for technology providers over the next five years. The next four trends focus on blending the digital and physical worlds to create an immersive, digitally enhanced environment. The last three refer to exploiting connections between an expanding set of people and businesses, as well as devices, content and services to deliver digital business outcomes.


How AI is transforming the hiring process

AI is making this task simpler, easier, and more efficient through predictive algorithms and data tools. HR software like ATS, Entelo, HireVue, Beamery, JobBot, and Mya is fostering quality hiring in tech-savvy enterprises, and each tool is a superhero in its respective field. Entelo and Beamery search and check the profiles of potential candidates on social media to find applicants planning to switch jobs, while HireVue is useful for assessing the personality of applicants on various facial and verbal parameters in video interviews. Mya and JobBot, meanwhile, are interactive chatbots that communicate with applicants; based on these virtual conversations they rank the candidates and help schedule interviews with the staff of the concerned department.


From big data to smart data, processes and outcomes

Even if big data is essential and we have so many data sources at our disposal, smart data is not about volume and not about technology alone. It’s about usefulness, with multiple layers of intelligence built in the way we acquire, process, analyze, store, interpret and improve data to act upon them and effectively make them useful. Think about paper sources and Intelligent Document Recognition, for instance. Or about how unstructured data are optimized, routed and turned into insights and flows, using artificial intelligence and intelligent information management. With fast changing business dynamics, the speed at which data are acted upon has become essential in a real-time economy too. In that context, there is fast data, to align with those fast moving dynamics and an increasingly demanding customer.


Amazon's latest Alexa devices ready to extend company's reach into your home

The Echo has been Amazon’s surprise hit in the three years since it launched, finding its way into tens of millions of kitchens around the world, offering internet radio, timers, weather and news reports and voice calls. Now Amazon will start selling a smaller, cheaper version of the original Echo, with fabric and wood veneers, as well as a new flagship device called the Echo Plus that promises to work instantly with dozens of smart home devices, such as locks, lights and electric sockets. “Setting up your smart home is still just too hard,” Limp said. “It can take up to 15 steps to do something as simple as set up a lightbulb.” Amazon’s vision is of homes with Echo devices in every room, listening to every word you say. A new Fire TV media device adds voice control over streaming TV and movie services like Amazon’s own Prime video, Netflix and others.


Citizen Data Scientists: Why Data Scientists aren't Enough

As Gartner succinctly defined, a citizen data scientist is a person whose primary job function is outside statistics or analytics but nevertheless uses models that are predictive, or have advanced analytics capabilities. Gartner foresees that citizen data scientists will foster greater depth of business analytics, as they provide increased support for (formal) data scientists and enable them to shift their focus to more complex analyses. Many employees who use SQL aren’t computer scientists or engineers, and using SQL is only one part of their job. They have other responsibilities, such as creating spreadsheets, delivering strategy recommendations, executing marketing campaigns. These other duties complement their data skills in a unique way. Imagine if companies replaced these analysts with computer scientists.


The 16 biggest data breaches of the 21st century

This list is based not necessarily on the number of records compromised, but on how much risk or damage the breach caused for companies, insurers and users or account holders. In some cases, passwords and other information were well protected by encryption, so a password reset eliminated the bulk of the risk. ... In September 2016, the once dominant Internet giant, while in negotiations to sell itself to Verizon, announced it had been the victim of the biggest data breach in history, likely by “a state-sponsored actor,” in 2014. The attack compromised the real names, email addresses, dates of birth and telephone numbers of 500 million users. The company said the "vast majority" of the passwords involved had been hashed using the robust bcrypt algorithm. A couple of months later, in December, it buried that earlier record with the disclosure that a breach in 2013, by a different group of hackers had compromised 1 billion accounts.


DeepMind now has an AI ethics research unit. We have a few questions for it

The DeepMind ethics research unit is in addition to an internal ethics board apparently established by DeepMind at the point of the Google acquisition because of the founders’ own concerns about corporate power getting its hands on powerful AI. However the names of the people who sit on that board have never been made public — and are not, apparently, being made public now. Even as DeepMind makes a big show of wanting to research AI ethics and transparency. So you do have to wonder quite how mirrored are the insides of the filter bubbles with which tech giants appear to surround themselves. One thing is becoming amply clear where AI and tech platform power is concerned: Algorithmic automation at scale is having all sorts of unpleasant societal consequences


Ransomware Will Target Backups: 4 Ways to Protect Your Data

The most consistent defense against ransomware continues to be good backups and a well-tested restore process. Companies that consistently back up their data and can quickly detect a ransomware attack should be able to restore their data and operations with a minimum of disruption. In some cases, we have seen wiper malware such as NotPetya pretending to be Petya ransomware while serving a similar ransom note. In these attacks, the victims won't be able to get their files back even if they pay the ransom — making the ability to restore from a backup even more critical. ... The strategy is straightforward: Encrypt the backup and individuals or companies are likely to lose the ability to restore data and are more likely to pay a ransom. Attackers are escalating their efforts beyond infecting single workstations and aim to destroy the backups, too.


IT automation trends point to smarter machines -- not just faster

Remediation should be a capability in any IT automation platform. It can take the form of rollback, wherein the platform identifies a problem that it cannot deal with directly and therefore returns the whole system or any specific parts of it back to a working, known state. The rollback process then alerts people, who use logs and other data to identify what caused the problem and take actions to fix it. ... Remediation can alternatively occur when the automation tool identifies why a system did not take the patch and changes it so that it can. This kind of remediation should be preemptive: An IT automation platform should examine all the target systems before attempting to roll out the patch and single out which ones cannot take it.
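The two remediation styles above, preemptive screening plus rollback to a known state, fit a small control loop. A sketch in which every function name is a hypothetical placeholder for a platform-specific operation:

```python
def patch_with_rollback(hosts, precheck, apply_patch, snapshot, restore):
    """Preemptively screen hosts, then patch each with automatic rollback."""
    # Preemptive remediation: single out hosts that cannot take the patch
    # before attempting the rollout, and leave them for human follow-up.
    eligible = [h for h in hosts if precheck(h)]
    skipped = [h for h in hosts if h not in eligible]
    for host in eligible:
        state = snapshot(host)          # capture a working, known state
        try:
            apply_patch(host)
        except Exception:
            restore(host, state)        # roll back; logs then go to people
    return skipped                      # hosts needing manual remediation
```

The key design point matches the article: the platform never leaves a host in an unknown state. It either patches successfully, returns the host to its snapshot, or declines to touch it at all.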



Quote for the day:


"All the technology in the world will never replace a positive attitude." -- Harvey Mackay