Daily Tech Digest - September 29, 2019

AI used for first time in job interviews in UK to find best applicants

Candidates are ranked on a scale of one to 100 against the database of traits of previous “successful” candidates, with the process taking days rather than weeks or months, says the company. It claims one firm had a 15 per cent uplift in sales. “I would much prefer having my first screening with an algorithm that treats me fairly rather than one that depends on how tired the recruiter is that day,” said Mr Larsen. Griff Ferris, Legal and Policy Officer for Big Brother Watch, said: "Using a faceless artificial intelligence system to conduct tens of thousands of interviews has really chilling implications for jobseekers. "This algorithm will be attempting to recognise and measure the extreme complexities of human speech, body language and expression, which will inevitably have a detrimental effect on unconventional applicants. "As with many of these systems, unless the algorithm has been trained on an extremely diverse dataset there's a very high likelihood that it may be biased in some way, resulting in candidates from certain backgrounds being unfairly excluded and discriminated against."

Traditional banks are struggling to stave off the fintech revolution

The other blind spot for legacy banks is their tendency to have a narrow and misguided understanding of disruptive business models. This usually begins with treating a new species of competitors as traditional ones. For example, Cathy Bessant, Bank of America's CTO, commented on Apple's announcement of a new credit card: "My reaction when I saw the announcement was, first competitively, all of the features that are in that card are offerings we have today." The propensity to see only the product or service and not the entire business model is common among incumbents across a range of industries. Kodak, Blockbuster and Nokia were only three of the hundreds of disrupted incumbents that saw only the product (and associated features) that threatened them, and not how the business models of their competitors allowed the creation of entirely new ecosystems that they were poorly equipped to survive in. By stooping to compete on a feature-by-feature basis, incumbents lose the chance to redefine an industry they once dominated.

Arizona getting help developing cybersecurity professionals

From the global to the local, cybersecurity breaches affect us in nearly every aspect of our lives. Hackers don’t discriminate. They attack small businesses and multinational corporations, federal agencies and local school districts, the young and old, the rich and poor. Many people have called the internet the modern battlefield and cybersecurity professionals the warriors of the digital age. Getting better at protecting ourselves, our businesses, our citizens and our communities against cyber threats will be one of the defining challenges of the next decade — and something we absolutely have to get right. The chief reason cyber attacks are increasing in number, scope, sophistication and damage is that it is really hard to get ahead of the hackers. Cybersecurity in 2019 and beyond requires a very different approach than we’re used to. And that requires a very different kind of cybersecurity professional. The problem is there are far more job openings in cybersecurity than qualified candidates to fill them.

Venture Capital 2.0: This Time It's Different?

We’re starting to see some rationality about this creeping in around the edges. Take Uber, whose theory of success (at least for now) is that it will eventually dominate local markets for both drivers and riders. If you believe that, then it’s worth subsidizing both sides with venture money. Uber may well be Exhibit A of the first-mover-advantage illusion. In just three months, Uber lost over $5 billion. The real problem here is one that we’ve seen before: to seed a market, a startup subsidizes early customers. The theory is that once you have them in the door, you can eventually create pricing power and raise prices. Eventually, unless you have some other revenue stream like darkly trading in people’s personal information, you have to charge enough to cover the cost of the service and make a profit. Once those $7 Uber rides start costing $30, riders will be back in their own cars or on the bus. Another “what were they thinking?” example? E-cigarette maker Juul.

The CIO’s role in driving agile transformation

Some CIOs channel solutions toward what their internal teams are skilled in and have the technologies to implement on their own. Others look to outsource more and seek partners or system integrators to oversee implementation. And some CIOs gripe when business leaders have already selected partners or when the CIO is asked to assist or bail out shadow IT. None of these is optimal; innovative solutions delivered faster and with higher quality more often require a blend of internal resources, partners, reuse of existing platforms, and experimentation with new technologies. CIOs should partner with their business leaders on developing an ecosystem of partners and technologies that drives current and future needs. This is not a procurement process, nor is it a vendor due-diligence process, as both of these assume requirements are known and one or more vendors are already in consideration. This is an exploration, and innovative, digitally minded CIOs are best equipped to define and manage this journey.

HPE Extends Its Cybersecurity Capabilities And Earns Two Cyber Catalyst Designations

Understanding that no cyber resilience solution is complete without the capability to recover from a cyber incident, HPE followed up its delivery of Silicon Root of Trust with its Server System Restore capability, built into the iLO 5 Amplifier Pack. This capability enables organizations to restore servers to their original operating environment. MI&S detailed these capabilities here. HPE continues to deliver on its cyber resilience with two new features that further put the company in a leadership position. One of the newer features that hasn’t received much coverage is called One-Button Secure Erase. This feature is exactly what it implies: the ability to completely erase every byte of data that sits on an HPE server when an IT department decides to end-of-life infrastructure. When that old server is ready to be recycled or donated, IT organizations can have confidence there will be no traces of data or proprietary information. This is an invaluable feature for organizations of all sizes.

Chatbot: The intelligent banking assistant

With chatbots gaining more traction, many firms across the globe have started offering off-the-shelf products that help developers to build, test, host and deploy these programs using Artificial Intelligence Markup Language (AIML), an open source specification for creating chatbots. A few platforms support integration with payment providers for seamless processing of customer payments based on a customer’s interaction with the bot. Increasingly, chatbots are also attracting interest in the world of FinTech, and a number of companies have developed their own chatbots using proprietary technology and algorithms. Chatbots utilise application programming interfaces (APIs) to integrate with data management platforms. This allows them to analyse the extracted data as well as web- and mobile-based user interfaces and deliver the necessary insights to the end customer. ... In their current form, chatbots have reached a certain level of maturity.
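To make the AIML reference concrete, here is a minimal, hypothetical category (the pattern and response are invented for illustration): AIML expresses a bot's pattern-response pairs in XML, and a real banking bot would call an API from the template rather than return canned text.

```xml
<aiml version="2.0">
  <!-- Match a balance enquiry and answer with a fixed template.
       Production bots substitute an API call for the static reply. -->
  <category>
    <pattern>WHAT IS MY BALANCE</pattern>
    <template>Please log in to the app to view your current balance.</template>
  </category>
</aiml>
```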

CIOs Should Be Asking Questions In The Boardroom, Not Just Answering Them.

“A company with a clear vision of the future is more likely to win by either setting the rules of the game or being quick to take advantage of an unfolding new industry landscape defined by other players.” The CIO can catalyze a board to “look for gaps; reframe closed mindsets; provide external perspective; and point to potentially better options or directions.” “Executive teams, no matter how effective at current operations, can often become myopic. A (CIO’s) big, well-aimed, simple question can disrupt such complacency,” he says. But, before this can even begin to happen, there remains the non-trivial matter of achieving board appointment for a technologist in the first place. CIO or CTO NED board appointment is a needle that is hard to move in a boardroom culture dominated by finance and general management. To move it, Gartner’s formula is to invite board candidates with technology backgrounds to a series of dinners, also attended by major recruiting firms and board chairmen.

Dear network operators, please use the existing tools to fix security

It's tempting to point the finger at network operators for failing to deploy RPKI. But another finger needs to be pointed at the software vendors for providing shoddy documentation. Routing security isn't the only system where deploying existing tools can make a big difference. Huston said in 2017 that failing to secure the DNS with DNSSEC is savage ignorance. Network operators should get onto that before fingers are pointed at them. Network operators should also avoid being the recipient of pointing fingers by deploying DMARC message authentication to prevent spammers from spoofing their domains for email. The UK's National Cyber Security Centre (NCSC) has used DMARC to significantly reduce that risk for government domains. "That's how you stop people clicking on the link, because they never get the crap in the first place. Simple things done at scale can have a difference," said Dr Ian Levy, the NCSC's technical director, in 2018. The Australian government has also been deploying DMARC on its domains, though its efforts have lagged behind the UK.
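As a concrete sketch of what deploying DMARC involves (the domain and report address are placeholders), the policy is published as a DNS TXT record at the `_dmarc` subdomain; `p=reject` tells receiving mail servers to drop mail that fails authentication, which is the behaviour Levy describes.

```
; Hypothetical zone-file entry publishing a strict DMARC policy
_dmarc.example.gov.  IN  TXT  "v=DMARC1; p=reject; rua=mailto:dmarc-reports@example.gov"
```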

Postgres Handles More Than You Think

Thinking about scaling beyond your Postgres cluster and adding another data store like Redis or Elasticsearch? Before adopting a more complex infrastructure, take a minute and think again. It’s quite possible to get more out of an existing Postgres database. It can scale for heavy loads and offers powerful features which are not obvious at first sight. For example, it's possible to enable in-memory caching, text search, specialized indexing, and key-value storage. ... Postgres provides a powerful server-side function environment in multiple programming languages. Try to pre-process as much data as you can on the Postgres server with server-side functions. That way, you can cut down on the latency that comes from passing too much data back and forth between your application servers and your database. This approach is particularly useful for large aggregations and joins. What’s even better is your development team can use its existing skill set for writing Postgres code. Other than the default PL/pgSQL (Postgres’ native procedural language), Postgres functions and triggers can be written in PL/Python, PL/Perl, PL/V8 (JavaScript extension for Postgres) and PL/R.
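A minimal sketch of the server-side-function idea (the table and function names are hypothetical): the aggregation runs inside Postgres, so only one row crosses the wire to the application instead of the raw data.

```sql
-- Hypothetical schema: orders(customer_id int, total numeric)
CREATE FUNCTION customer_spend(cust int)
RETURNS numeric
LANGUAGE plpgsql
STABLE
AS $$
BEGIN
  -- The sum is computed server-side; only the result is returned.
  RETURN (SELECT COALESCE(sum(total), 0) FROM orders WHERE customer_id = cust);
END;
$$;

-- Called from the application as: SELECT customer_spend(42);
```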

Quote for the day:

"Give whatever you are doing and whoever you are with the gift of your attention." -- Jim Rohn

Daily Tech Digest - September 28, 2019

5G and IoT: How to Approach the Security Implications

The first thing is an IoT bot. The botnet nodes actually spend most of their time scanning the network looking for other victims. That’s the primary thing that they do. And because of that, these botnets naturally increase in size over time. Eventually, once they’ve covered all the devices available, the botnet sizes are self-limiting. And that’s a thing to bear in mind when we start talking about the 5G thing. Because in the future with 5G, the number of IoT devices is going to increase exponentially, and so the size of these potential botnets is going to be quite incredible. That’s one thing to bear in mind. When an IoT bot finds a new victim, it reports back to its command-and-control server, which then goes ahead and infects that new device that’s been detected. That device then becomes a member of the botnet, and the botnet gets larger and continues to scan. One of the key things here is that in order to be infected, the device has to be visible from the internet – visible from the existing botnet members.
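The self-limiting growth described above can be illustrated with a toy simulation (all the numbers here are invented for illustration, not measurements): each bot scans random addresses every round, and because hits on already-infected hosts add nothing, growth saturates as the pool of uninfected, reachable devices runs out.

```python
import random

def simulate_botnet(population=10_000, initial_bots=10,
                    scans_per_bot=50, rounds=30, seed=42):
    """Toy model of self-limiting botnet growth: every infected node
    scans random hosts; the infected set can never exceed the population."""
    rng = random.Random(seed)
    infected = set(range(initial_bots))
    sizes = [len(infected)]
    for _ in range(rounds):
        newly_infected = set()
        # Each bot probes scans_per_bot random addresses this round.
        for _ in range(len(infected) * scans_per_bot):
            target = rng.randrange(population)
            if target not in infected:
                newly_infected.add(target)
        infected |= newly_infected
        sizes.append(len(infected))
    return sizes

sizes = simulate_botnet()
```

Plotting `sizes` gives the classic S-curve: rapid early spread, then a plateau at the number of reachable devices, which is why vastly more 5G-connected devices raises the ceiling so dramatically.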
Much like any disruptive technology, blockchain has a diversity problem which further limits accessibility. For the most part, blockchain expertise is confined to the financial and technological industries and the affluent white men that dominate them. Services from Amazon, IBM, Microsoft and Oracle may bolster blockchain use, but they don’t solve this fundamental issue. Tech education startup Maiden aims to make blockchain more accessible by teaching members of traditionally underrepresented groups about transactions, smart contracts, and other applications of the technology. Ultimately, if blockchain products are created by groups that genuinely represent society, they will impact more people and break down educational barriers. Big businesses with tech expertise are making it possible for more organisations to benefit from blockchain with hosted platforms and BaaS. However, without more effort given to education and understanding, companies will continue to shy away from distributed ledger technology.

More Data Doesn’t Guarantee That Analytics Will Deliver Digital Transformation

We often overlook the presence of disconnected and fragmented data silos – making it impossible to paint a complete picture of the business because different segments linger in detached states or isolated buckets. Left disintegrated, these data buckets rust in data warehouses and lakes – unless they evolve into cohesive and compatible building blocks that form the foundation of an intelligent enterprise. ... Having more data doesn’t do much good if we aren’t asking the right business questions or don’t understand the assumptions behind them. Through critical thinking, we need to carefully examine evidence based on what’s relevant to the question before reaching any conclusions or making any decisions. That starts by asking questions, which is a prerequisite for asking the right questions. The process of creating value with data begins and ends with business leaders who promote a culture of data-driven decision-making. When it’s absent, we lose direction and guidance and cannot make a significant impact.

GDPR: Only one in three businesses are compliant – here's what is holding them back

"For many organisations, the true size of the GDPR challenge only became apparent as they began the initial projects to identify the applicable data that they held. As a result, only the most focused organisations had completed their GDPR readiness by the time the legislation came into force," Chris Cooper, head of cybersecurity practice at Capgemini, told ZDNet. Businesses that aren't yet compliant with privacy legislation point to a number of obstacles that prevent them from being so. Chief among those is legacy IT systems, with 38% of those surveyed suggesting that their current IT landscape isn't aligned to the complexities of GDPR. Meanwhile, 36% believe the requirements of GDPR are too complex and require a lot of general effort to implement, while one third of respondents say that the financial costs of achieving alignment with GDPR are too prohibitive. Not only are businesses that remain non-compliant putting themselves at risk of falling victim to a data breach and the financial and reputational damage that could create – alongside the financial cost of a regulator fine – they're also holding themselves back from the benefits that compliance can bring.

New SIM card attack disclosed, similar to Simjacker

This new attack, named WIBattack, is identical to Simjacker, an attack disclosed at the start of the month by mobile security firm AdaptiveMobile. Both attacks work in the same way, and they grant access to similar commands, with the exception that they target different apps running on the SIM cards. Mainly, Simjacker runs commands against the S@T Browser app, while WIBattack sends commands to the Wireless Internet Browser (WIB) app. Both are Java applets that mobile telcos install on SIM cards they provide to their customers. The purpose of these apps is to allow remote management for customer devices and their mobile subscriptions. In a report released earlier this month, AdaptiveMobile said it discovered that a "private company that works with governments" was using rogue commands sent to S@T Browser apps running on SIM cards to track individuals. In a report published last weekend, security researchers from Ginno Security Labs said that the WIB app was also vulnerable to similar attacks, although they were not aware of any attacks.

10 principles of workforce transformation

Many business leaders realize that they can’t just hire the workforce they need. There aren’t enough prospective recruits, and the expense would be enormous. Instead, companies must upskill their existing employees or members of their communities. This means expanding people’s capabilities and employability, often using adult learning and training tools, to fulfill the talent needs of a rapidly changing economy. Upskilling is part of the answer. But you also need to rethink your jobs: redesign the workflow, combine some positions, add others, and probably eliminate some. You need to be more creative in finding and onboarding people, including through acquisitions, partnerships, gig economy–style freelancing arrangements, and talent pools oriented to flex work. Finally, you must fill your enterprise with opportunities for continual self-renewal via modern learning strategies and digital technologies, so that becoming adept in new technologies is just part of everyday life.

AI And The Evolutionary Commoditisation Of RPA

Artificial Intelligence's evolutionary path is actually very different. Although it’s been around in various forms since the 1950s, we are still very early in the journey, but with the technology developing at an exponential rate. What we have now is the perfect storm of ubiquitous data (which AI feeds off), storage costs for all this data that is so cheap that they almost become irrelevant, the processing power to run complex models in minutes rather than days, and everything connected together (including access to publicly available data training sets). AI is ready to really lift off. But before we get carried away and start to imagine sentient machines that will take over the world, we need to remember that everything that AI does is very narrow. That means that each AI model can do one thing, and one thing only, very well. An AI trained to recognise pictures of dogs can’t read text. It can’t even be used to recognise pictures of cats – the system would need to be completely wiped and retrained using cat pictures instead of dog pictures.

The IT Pyramid of Pain: how IBM’s CIO Fletcher Previn retains top talent

For many organisations, digital transformation has shifted the function of IT from being solely a service provider to a business driver. On these grounds, Fletcher encourages other IT departments to get more involved in the cultural aspects of their organisation. He said: “The culture of any work environment is largely a function of how work gets done. That, in turn, means that the tooling and IT surrounding the employees is not trivial – it’s core to any strategy for creating a high-performance workforce. “In order to create an environment where talented people want to work, and in particular, where gifted engineers want to work, I have to provide a productive environment for our people. “Also important is building out a modern DevOps software development stack, and enabling employees with the best tools available. Our general approach to this is: give people the right tools and equipment, manage those assets in a modern way, and enable self-service in the environment.”

10 Ways AI And Machine Learning Are Improving Endpoint Security

AI and machine learning are proving to be effective technologies for battling increasingly automated, well-orchestrated cyberattacks and breach attempts. Attackers are combining AI, machine learning, bots, and new social engineering techniques to thwart endpoint security controls and gain access to enterprise systems with an intensity never seen before. It’s becoming so prevalent that Gartner predicts that more than 85% of successful attacks against modern enterprise user endpoints will exploit configuration and user errors by 2025. Cloud platforms are enabling AI and machine learning-based endpoint security control applications to be more adaptive to the proliferating types of endpoints and corresponding threats. ... Combining supervised and unsupervised machine learning to fine-tune risk scores in milliseconds is reducing fraud, thwarting breach attempts that attempt to use privileged access credentials, and securing every identity on an organization's network. Supervised machine learning models rely on historical data to find patterns not discernable with rules or predictive analytics.
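As a toy illustration of combining the two model families (the weights and inputs are arbitrary, not from any vendor's product): a supervised model's fraud probability and an unsupervised anomaly score, each normalized to 0-1, can be blended into a single 0-100 risk score that a policy engine then thresholds.

```python
def risk_score(supervised_prob: float, anomaly_score: float,
               w_supervised: float = 0.6) -> int:
    """Blend a supervised fraud probability with an unsupervised anomaly
    score (both in [0, 1]) into a 0-100 risk score via a weighted average."""
    blended = w_supervised * supervised_prob + (1 - w_supervised) * anomaly_score
    return round(blended * 100)

# A high fraud probability plus an unusual session yields a high score.
score = risk_score(supervised_prob=0.9, anomaly_score=0.8)
```

Real platforms tune the weights and thresholds continuously; the point is only that the two signals are complementary, since the supervised model covers known patterns and the anomaly score covers novel behavior.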

The best API strategy is not to start with an API strategy

Business requirements first -- APIs later, says David Berlind, editor of ProgrammableWeb, in his keynote presentation at the recent MuleSoft Connect event in New York. "The APIs come at the very end," he says. Every effort should start with customer experience and business strategy, he explains. "You don't start with an API strategy. You start with a business strategy and customer experience. Then you figure out what APIs need to be in place, so information can be exchanged between you and your partners. Then you think about the design of those APIs, the technical specifics and tactical stuff. Then you have an API strategy, and an ecosystem." There's been plenty of talk that the "i" in CIO or IT should stand for "innovation." However, Berlind believes "imagination" is more appropriate. "For decades now, we've been struggling to keep the lights on in IT, reduce costs, do more with less. In my view, it's time to rethink that process. Get the organization to understand the power of the API, and how it could be such a game-changer to whatever industry you're in." ... "It's important to get everybody in the organization aware of where the APIs are. But it's also equally important to make sure the entire organization understands the power of the APIs, and how it allows them to imagine different outcomes -- outcomes that were quite unimaginable just a few years ago. ..."

Quote for the day:

"Valor in the leader is often an expression of the leader's character, fortitude, grace, vulnerability, openness, and honesty." -- Catherine Robinson

Daily Tech Digest - September 26, 2019

Social engineering explained: How criminals exploit human behavior

Social engineering has proven to be a very successful way for a criminal to "get inside" your organization. Once a social engineer has a trusted employee's password, he can simply log in and snoop around for sensitive data. With an access card or code to physically get inside a facility, the criminal can access data, steal assets or even harm people. In the article Anatomy of a Hack, a penetration tester walks through how he used current events, public information available on social network sites, and a $4 Cisco shirt he purchased at a thrift store to prepare for his illegal entry. The shirt helped him convince building reception and other employees that he was a Cisco employee on a technical support visit. Once inside, he was able to give his other team members illegal entry as well. He also managed to drop several malware-laden USBs and hack into the company's network, all within sight of other employees. You don't need to go thrift store shopping to pull off a social engineering attack, though.

Why you should hire staff from firms that have fallen victim to hackers

Being equipped with the experience of having been through it before can provide benefits not only for setting up systems to prevent damaging attacks, but the processes required if an organisation does fall victim to hackers. Rather than viewing staff who've worked at organisations that have suffered a cyberattack as having failed to do their job, other organisations should be actively seeking out these people to learn from them – even to the extent of hiring them for their own security teams. "Senior members of security staff who've worked in organisations which have had a major, publicised breach, that can be seen as a negative – somehow individuals can be tarnished with that. That's probably the exact opposite to the way to how the industry should be thinking," Darren Thomson, CTO EMEA at Symantec, told ZDNet. "Someone who has lived through one of these incidents and been through the whole process, recovering from the bad experience then implementing additional security and privacy measures: that knowledge and experience is valuable and it's good to have someone with it," he added.

When to use AWS OpsWorks vs. CloudFormation or Elastic Beanstalk

With AWS OpsWorks, developers can deploy Puppet or Chef to manage declarative configurations within EC2 instances. Like CloudFormation, you can use OpsWorks to deploy AWS resources. However, OpsWorks automates the initial deployment of applications, as well as the ongoing changes to the operating system and application infrastructure. Both Puppet and Chef can also control the deployment of AWS infrastructure. You should use OpsWorks in place of CloudFormation if you need to deploy an application that requires updates to its EC2 instances. If your application uses a lot of AWS resources and services, including EC2, use a combination of CloudFormation and OpsWorks. IT teams can integrate CloudFormation with OpsWorks to configure newly deployed EC2 instances with Chef or Puppet, rather than simple shell scripting.
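A minimal sketch of that CloudFormation-plus-OpsWorks combination (the stack name and ARNs are placeholders, and a complete template would also declare layers and instances): the template declares an OpsWorks stack so that instances it launches are configured by Chef rather than shell scripts.

```yaml
# Hypothetical CloudFormation fragment declaring a Chef-managed OpsWorks stack.
Resources:
  AppStack:
    Type: AWS::OpsWorks::Stack
    Properties:
      Name: my-app-stack
      ServiceRoleArn: arn:aws:iam::123456789012:role/opsworks-service-role
      DefaultInstanceProfileArn: arn:aws:iam::123456789012:instance-profile/opsworks-instances
      ConfigurationManager:
        Name: Chef
        Version: "12"
```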

New Relic CEO Lew Cirne on Observability in Development

A large enterprise might have 2,000 applications. Some of those applications are cloud native and many of those may be actively worked on right now. For those applications, they may be choosing to manually instrument them to not only be functional but also observable. When they do that, they use open APIs. That data would go to some other tool. Then they’ve got some applications they don’t have time to instrument and they want to see in production -- they drop our agent in. They go to New Relic for some of their stuff and go to other tooling for other needs. Now they can have it all in one place. “That open telemetry is a big change for us. For people familiar with New Relic it’s a new way to look at us. The second part of that is with all of this telemetry data coming into one place, we’ve believed for some time that dashboards are not enough. If you look at why people love our APM [application performance management] product, for example, it’s more than a dashboard. It’s an interactive application that understands the telemetry data we collect and presents it to our customers in a useful way.

Enterprises tap edge computing for IoT analytics

Putting analytics, servers and storage together at the edge to process data from the cameras and IoT sensors on the equipment eliminates the need “to send command and control to the cloud or a centralized data center,” which can take 40 milliseconds to get from one spot to another, Pugh says. “That’s too long to interpret the data and then do something about it without impacting production.” That type of decision-making needs to happen in real time, he says. Edge computing can be taxing on an IT department, though, with resources distributed across sites. In SugarCreek’s case, six manufacturing plants span the Midwest U.S. SugarCreek plans to move from its internally managed Lenovo edge-computing infrastructure to the recently launched VMware Cloud on Dell EMC managed service. SugarCreek beta-tested the service for Dell EMC and VMware when it was code-named Project Dimension. SugarCreek already uses edge computing for local access to file and print services and Microsoft Active Directory; to store video from indoor and outdoor surveillance cameras; and to aggregate temperature and humidity sensors to assess how well a machine is running.

75% of execs cite phishing as the most significant security threat to businesses

"Organizations worldwide are realizing the need to invest in employee training and deploy different security awareness training solutions with the hope of mitigating the risk of data breaches," Gian said. "The problem is that many organizations settle for dated phishing simulation solutions that train employees randomly and require manual effort to operate. The outcome is disappointing, employee behavior doesn't change and information security teams remain powerless and frustrated in the face of successful phishing attacks. Effective training should not become an IT and financial burden, but be done autonomously, via data science driven methodology that offers each employee a customized, continuous training every single month and significantly changes employee behavior, hence mitigates organizational risk of cyber-attacks. "Just like the right technology," Osterman said, "such as firewalls or endpoint detection and response solutions, can protect an organization's data and financial assets from theft or destruction, so can the right employee training."

Have a Failing Big Data Project? Try a Dose of AI

AI is a broad category that can include supervised and unsupervised machine learning, neural networks and reinforcement learning. "The key to knowing which of these tools to use is predicated on a detailed understanding of the problem you are trying to solve and the types of data -- structured, semi-structured, unstructured -- with which one has to work," Schmarzo explained. A good data scientist, he noted, is like a skilled carpenter in that both will use the best combinations of tools to solve the problem at hand. AI may not be new, but AI at scale within complex organizations is still in its early stages. "We still do not yet understand every consequence of integrating AI into larger systems," Gallego said. "Organizations should be ready to take on this risk and should be mature enough to understand the consequences and tradeoffs." Heineken noted that all big data projects, regardless of the approach used, have three basic failure points: understanding the question that needs to be answered, the data architecture and its availability, and having the ability to land insights into a business workflow at scale. Effectively addressing these issues "are all critical success factors," he advised.

Russian pleads guilty in massive JPMorgan hacking scheme

According to the indictment unsealed at the time, Shalon was the mastermind of the whole operation, which prosecutors dubbed “hacking as a business model.” Shalon was the owner of US-based Bitcoin exchange Coin.mx, which he operated with Orenstein. Both are Israelis. With the help of Aaron, an American, the group allegedly bought up the type of penny stocks so often used in pump-and-dump scams. Then, using the customer data allegedly stolen from JPMorgan, Dow Jones, Scottrade and others, they blasted out emails to dupe the financial organizations’ customers and subscribers into buying the junk. It worked like a charm: they allegedly pocketed $2m from one deal alone. Prosecutors said the scheme generated “tens of millions of dollars in unlawful proceeds.” According to Monday’s indictment, Tyurin took his marching orders from Shalon. The New York Times reports that Tyurin’s lawyer, Florian Miedel, said in a statement that his client was “hired by the originators and brains of the scheme to infiltrate vulnerable computer systems at their direction.”

Data Security in the Age of Online Payments and Social Media Validation: Where Does the Buck Stop?

Amidst threats that are looming large, it is important to guard against descending into a spiral of pessimism and hate. Finding the objective middle ground between abandonment of technology and resigning to a total surrender of privacy for instant benefit is the need of the hour. And that begins with the acknowledgement of all the advantages leveraged so far. To put things in perspective, it is necessary to ask three questions integral to this global dilemma. One, where does the buck stop with regard to data security? Two, what is the role of the user in protecting his data and privacy while continuing to integrate the digital advantage into routine tasks? Three, is it possible to overcome the trust deficit that is growing by the day? Before looking at the answers, let’s shed light on the evolution of the smart world that we claim to inhabit. From the days of the barter system to paying bills and having food delivered to your doorstep, we have come a long way indeed.

Mind the Gap – a Road to IT/OT Alignment

Few organizations currently manage IT and OT with the same staff and tools. After all, these networks evolved with a different set of priorities and they operate in inherently different environments. Nevertheless, in order to address this new complex threat and to protect this broader attack surface, many industrial organizations have begun to converge their IT and OT groups. The ‘convergence initiative’ is anything but simple. The growing pains associated with bringing together these two substantially different worlds can prove to be a challenge. The IT/OT convergence trend is not only driving integration of IT tools with OT solutions; it also requires alignment of strategic goals, collaboration and training, and bridging between two departments whose people have different backgrounds, mindsets and concerns. In general, IT people are used to working with the latest and greatest hardware and software, including the best security available to protect their networks. They tend to spend time patching, upgrading and replacing systems.

Quote for the day:

"Real leadership is being the person others will gladly and confidently follow." -- John C. Maxwell

Daily Tech Digest - September 25, 2019

Digital twins – rise of the digital twin in Industrial IoT and Industry 4.0

Digital twins offer numerous benefits on which we’ll elaborate later. In fact, you might already have seen the concept in action. If you didn’t, the video below, using a bike equipped with sensors, gives you a good idea. However, in real life you’ll notice that digital twins today are predominantly used in the Industrial Internet or Industrial Internet of Things and certainly engineering and manufacturing. If you remember our airplane engine or other complex and technology-intensive physical assets such as IoT-enabled industrial robots and far more, you can imagine why. You can even create a digital twin of an environment with a set of physical assets, as long as you have the data. ... In the future we’ll see twins expand to more applications, use cases and industries and get combined with more technologies such as speech capabilities, augmented reality for an immersive experience, AI capabilities, more technologies enabling us to look inside the digital twin removing the need to go and check the ‘real’ thing and so on.

AI will be the biggest disruptor in our lifetime: Amitabh Kant, CEO, NITI Aayog

India is among the very few countries globally where the government has driven digitization in a big way. For instance, almost 99.3 percent of Indians pay their Income Tax online. Almost 96 percent of these filings are cleared within three months because they are digital. The new Goods & Service Tax (GST), is digital – cashless and paperless. The Ayushman Bharat scheme is portable, paperless and digital. It provides health insurance to 500 million Indians. The number of beneficiaries is greater than the population of the USA, Europe, and Mexico put together. Every single rupee released through the Public Finance Management System (PFMS) is tracked to the last point digitally. By integrating technology into various aspects of the economy, the government has generated vast volumes of datasets. It is important that we use this data along with computing power and new algorithms to drive huge disruption. That’s the only way we can radically leapfrog and catch up with advanced economies.

European enterprises 'waste' £24,000 a day on unused cloud services, says Insight research

Given the foundational role that cloud is increasingly playing within enterprise digital transformation strategies, these are important areas to address and get right, the report continues, as organisations set about making better use of their data through the deployment of analytics, machine learning and artificial intelligence tools. Indeed, 46% of respondents flagged AI, big data, machine learning and deep learning tools as being “critical” to their digital transformation initiatives over the past two years. “When analysed, shared, and leveraged intelligently, [data] can facilitate more informed decision-making, improve the quality of offerings, and enhance the customer experience,” the report said. “IT professionals express confidence in AI, big data and machine learning because these technologies enable organisations to transform data into business intelligence.” But this confidence could prove to be misplaced unless organisations have a robust cloud strategy in place to underpin their plans, the report added.

There is a real demand for AI in healthcare, but preserving privacy is key

The introduction of AI into healthcare is important for several reasons. The main one, though? Scale. “With the NHS potentially losing up to 350,000 staff by 2030, using AI will be the only way to scale services to match the mounting demand that is hitting the UK with a shrinking workforce,” explains Lorica. The impact of AI in healthcare won’t only be felt in the NHS. Instead, the technology will have a wide range of applications in everything from personal medicine to research, diagnosis and logistics. But, despite a clear desire to integrate AI, it must be done correctly. And before it can effectively disrupt the sector, Lorica suggests that “various organisational and cultural changes need to be implemented. ... Before AI can truly transform the healthcare sector, the elephant in the room needs to be addressed: patient confidentiality or privacy. “The NHS holds personal information about almost every person living in the country, which means preserving privacy, collecting and cleaning data and data sharing is paramount,” explains Lorica.

The Interesting Case of Who’s Using the IT4IT™ Standard – Part Two

Digitalization is driving the proliferation of Cloud and Mobility and is causing IT organizations to rethink their IT Operating Model to support both the digital workforce and new service delivery models. To exploit the rapid pace of disruptive IT innovation, HCL Technologies chose to adopt The Open Group IT4IT Reference Architecture to design and develop its XaaS-based (Everything as a Service) product and service offering. This meant that HCL Global IT needed to better understand the business requirements of IT to allow it to achieve the agility and velocity the business and end users required of its services. To achieve this, HCL Global IT required a unified and sophisticated IT Operating Model to support the business in their Digital Transformation journey. Therefore, HCL aligned its products and services to the IT4IT Value Stream-based IT Reference Architecture and developed a product and platform named XaaS Service Management (XSM), which has the capability to address customer-specific issues and challenges.

BigTech is coming. Is banking ready?

Many banks are responding to the competition from BigTechs (and fintechs) by learning from and co-creating with them to strengthen their client propositions. They are also investing heavily to support new partnerships, acquisitions, and the development of in-house solutions. Upholding this, a recent Bloomberg report that ranked banks by technology spending so far in 2019 showed the top five had invested a combined USD44 billion. Corporates have also been responding to the ‘uberisation’ of commerce following BigTech’s move from online consumer models further into the B2B arena. As well as re-engineering their physical and financial supply chains, corporates are now also rethinking their relationships with transaction banks. For example, as manufacturers try to replicate BigTech’s speed, they are considering decentralised production. Having multiple yet smaller assembly locations puts companies closer to the end-consumer. It also creates a more conducive environment to react to changing local demand and offer more customisation.

How Artificial Intelligence and IoT are transforming real estate

The trend for workplace flexibility also provides an incentive for rental platforms and Space as a Service. The rise in smaller companies (self-employed persons) is resulting in increased demand for flexible and on-demand workplaces. Many corporates are trying to cut overheads, opting for shared workspaces. As part of this we have developed a software product called Yardi Kube that centres can use to manage members’ space allocations. Yardi Kube folds in a technology management system for shared workspaces, providing IP addresses, Wi-Fi and telephones that are crucial for this sector. Yardi’s coworking module will be released in 2020 across the Middle East. The platform will provide the most comprehensive coworking software on the market as it combines financial, workspace and technology management in a centralised database. IoT is a technology where systems such as plumbing, electrical outlets, thermostats and lighting are connected and perform smart functions via the internet. From convenient property showings and increased energy efficiency to predictive maintenance, IoT applications are making it easier for people to buy, sell and own rental properties. Smart homes with IoT capabilities usually have a higher market value than those without.

Can Oracle substantiate its cloud bluster?

Commenting on the competition in the cloud, Ellison said the cloud databases were open source-based and a lot more specialised. But he added: “None of them are autonomous. None of them are secure. None of them give you 99.995% availability. I mean, they’re – we’re 100 times more reliable.” But Oracle does face a challenge from these open source competitors. The competition is not coming directly from its previous enterprise customers and CIOs in these organisations. Instead, it is being driven from the bottom up by software developers choosing products they consider more exciting and, arguably, technically superior for the applications that use them, compared to Oracle. A recent Stack Overflow survey of 6,000 developers in the UK and Ireland reported that Oracle was not among the top three database servers being used: MySQL was used by 44.9% of respondents, Microsoft SQL Server by 40.7% and PostgreSQL by 31.4%.

Testing Microservice: Examining the Tradeoffs of Twelve Techniques - Part 2

Most projects need a combination of testing techniques, including test doubles, to reach sufficient test coverage and stable test suites — you can read more about this so-called testing pyramid. You are faster to market with test doubles in place because you test less than you otherwise would have. Costs can grow with complexity. Because you do not need much additional infrastructure or test-doubles knowledge, this technique doesn’t cost much to start with. Costs can grow, however — for example, as you require more test infrastructure to host groups of related microservices that you must test together. Testing a test instance of a dependency reduces the chance of introducing issues in test doubles. Follow the test pyramid to produce a sound development and testing strategy or you risk ending up with big E2E test suites that are costly to maintain and slow to run. Use this technique with caution only after careful consideration of the test pyramid and do not fall into the trap of the inverted testing pyramid.
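As a minimal sketch of the test-double idea, assuming a hypothetical pricing microservice as the dependency, Python's stdlib `unittest.mock` can stand in for the remote call, making the test fast and deterministic with no extra infrastructure:

```python
from unittest.mock import Mock

def total_with_tax(price_service, order_id: str) -> float:
    """Code under test: normally calls a remote pricing microservice."""
    base = price_service.get_price(order_id)  # network call in production
    return round(base * 1.2, 2)

# Swap the real dependency for a test double: fast, deterministic,
# and requiring no extra test infrastructure.
fake_pricing = Mock()
fake_pricing.get_price.return_value = 10.0

assert total_with_tax(fake_pricing, "order-42") == 12.0
fake_pricing.get_price.assert_called_once_with("order-42")
```

The trade-off described above applies directly: the double never exercises the real service's behaviour, which is why test instances of dependencies and a balanced test pyramid remain necessary.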

Google Wins 'Right to Be Forgotten' Case

Google commented on the ruling in a statement: "Since 2014, we've worked hard to implement the right to be forgotten in Europe, and to strike a sensible balance between people's rights of access to information and privacy. It's good to see that the court agreed with our arguments." Europe's General Data Protection Regulation, which went into full effect last year, has a separate "right to be forgotten" provision with much broader requirements. ... While ruling that Google does not have to extend the right to be forgotten for European citizens outside of Europe, the court acknowledged that in today's globalized world, information can harm a person's reputation. But it said different countries have different approaches to the right to be forgotten, and hence a universal law cannot be applied. "A global de-referencing would meet the objective of protection referred to in EU law in full. ... Numerous third states do not recognize the right to de-referencing or have a different approach to that right," the court said.

Quote for the day:

"Great leaders go forward without stopping, remain firm without tiring and remain enthusiastic while growing" -- Reed Markham

Daily Tech Digest - September 24, 2019

Two AMD Epyc processors crush four Intel Xeons in tests

Tests by the evaluation and testing site ServeTheHome found a server with two AMD Epyc processors can outperform a four-socket Intel system that costs considerably more. If you don’t read ServeTheHome, you should. It’s cut from the same cloth as Tom’s Hardware Guide and AnandTech but with a focus on server hardware, mostly the low end, though they throw in some enterprise gear as well. ServeTheHome ran tests comparing the AMD Epyc 7742, which has 64 cores and 128 threads, and the Intel Xeon Platinum 8180M with its 28 cores and 56 threads. The dollars, though, show a real difference. Each Epyc 7742 costs $6,950, while each Xeon Platinum 8180M goes for $13,011. So, two Epyc 7742 processors cost you $13,900, and four Xeon Platinum 8180M processors cost $52,044, nearly four times as much as the AMD chips. And that’s just the chips. The actual servers will also set you back a pretty penny, especially since four-socket servers cost much more than two-socket servers regardless of the processor you use.
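Spelling out the arithmetic behind those figures (list prices and core counts as quoted in the article):

```python
epyc_price, epyc_cores = 6_950, 64    # AMD Epyc 7742, per the article
xeon_price, xeon_cores = 13_011, 28   # Intel Xeon Platinum 8180M

two_epycs = 2 * epyc_price            # $13,900
four_xeons = 4 * xeon_price           # $52,044

print(round(four_xeons / two_epycs, 2))   # 3.74 - nearly four times the cost
print(round(epyc_price / epyc_cores, 2))  # ~$108.59 per core
print(round(xeon_price / xeon_cores, 2))  # ~$464.68 per core
```

The per-core figures make the gap even starker than the sticker prices alone.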

Build cloud economics expertise in-house

Organizations think the cloud is more expensive than an on-premises data center because they don't capture costs effectively throughout the entire process. That starts during the architecture and design phases, where expenditures often go unnoticed. Put the tools in place to communicate -- at a high level -- all features and services that power your applications and environments. For example, it could take the form of a reference architecture that lays out all the services you have in production. Enterprises are better positioned to spot potential money pits when there's cost transparency, and managers, IT staff and back-office staff understand their roles in keeping costs in check. As you put together plans for your cloud initiative, remember to factor in the costs of such training. Organizations deploy a variety of cloud storage techniques, such as archival, reduced redundancy and backup -- each of which carries its own impacts on performance and cost.
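A minimal sketch of such cost transparency, with an invented reference-architecture inventory and illustrative monthly figures, showing how spend can be rolled up by service category so nothing goes unnoticed at design time:

```python
# Toy cost-transparency inventory: every component in the reference
# architecture is listed with its backing cloud service and monthly
# spend. All names and figures are invented for illustration.
architecture = {
    "web-frontend":   {"service": "compute",          "monthly_usd": 1200},
    "orders-db":      {"service": "database",         "monthly_usd": 2400},
    "media-archive":  {"service": "archival storage", "monthly_usd": 300},
    "nightly-backup": {"service": "backup storage",   "monthly_usd": 450},
}

by_service = {}
for item in architecture.values():
    by_service[item["service"]] = (
        by_service.get(item["service"], 0) + item["monthly_usd"])

total = sum(i["monthly_usd"] for i in architecture.values())
print(by_service)
print(total)  # 4350
```

Even a flat inventory like this gives managers and back-office staff a shared view of where the money goes.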

Putting blockchain technology to good use

Richard Hunt, founder of Turnkey Consulting, believes that once an individual has been through the process to prove their identity, this proof can be reused in other situations where ID is required. “A digital identity would enable citizens to take back control of their data and their identity, choosing who to share this information with and, perhaps more importantly, who not to,” he says. “It would also allow individuals to both fully understand and capitalise on the value of their personal data.” Gartner distinguished vice-president David Furlonger says governments are looking at ways blockchain can be deployed to improve efficiency. Efficiency-based initiatives are founded on the idea that decentralised, multiparty transactions can be streamlined using blockchain. Government interests are mostly driven by their need to decrease friction in disconnected processes, interactions or transactions between a variety of government organisations or involving the broader public/private ecosystems.

Microsoft delivers emergency security update for antiquated IE

IE was demoted to second-citizen status with the introduction of Windows 10, but Microsoft has been adamant that it will continue to support the browser. IE, particularly IE11, remains necessary in many enterprises and organizations for running aged web apps and internal websites. The browser may retreat to a "mode" within a vastly reworked Microsoft Edge - and the stand-alone abandoned - but IE will live on in some form. Still, it's no longer the most popular kid on the block: According to the latest data from web analytics vendor Net Applications, IE accounted for just 9% of all Windows-based browsing activity. For comparison, Edge's share of all Windows was around 7%. According to information in the description of the update package, the emergency IE fix is available only through the Microsoft Update Catalog. Users would have to steer a browser to that website, then download and install the update. The easiest way to locate the IE update is by using the link in the OS-appropriate KB (for knowledge base) gleaned from the security bulletin. (No one said Microsoft makes it easy.)

Managers Lack Confidence To Develop Skills Employees Need Today: Gartner

“Today’s organizations are undergoing a digital transformation that directly impacts how they do business, and they are finding a significant skills gap within their workforces,” said Jaime Roca, senior vice president in the Gartner HR practice. “Our research found that 70% of employees have not mastered the skills they need for their jobs today, let alone the skills needed for their future roles.” Organizations that are most successful at developing their employees have focused on cultivating Connector managers, who are able to connect employees to the right people and resources at the right time. In fact, Connector managers boost employee performance by up to 26% and more than triple the likelihood that their employees will be high performers. “Connector managers give targeted coaching and feedback in their areas of expertise, but they recognize that there are skills best taught by people other than themselves,” said Sari Wilde, managing vice president in the Gartner HR practice.

Contemporary Front-end Architectures

Web applications have evolved from simple static websites (two-tiered architecture) into complex multi-layered SPA and SSR driven API-first systems. CMS systems have grown into headless, content-first systems. The front-end community has changed rapidly in recent times. It started with the DOM-infused algorithms introduced by jQuery, which were quickly succeeded by MVC-based Backbone.js. And, in no time, we found ourselves in the jungle of bidirectional and unidirectional data flow architectures. Somewhere, we lost track of how we got here. How did a world so drenched in MVC suddenly end up with the unidirectional data flow pioneered by React? What is the correlation? As we progress, we will attempt to unlock this puzzle. Though aimed at front-end engineers, the article should help any web developer seeking a general understanding of modern web application architecture. Software architecture underpins a large number of activities — process, requirement gathering, deployment topology, technology stack, etc. However, that is outside the scope of this article.

Top 5 nontechnical skills for cloud computing success

Legal implications in cloud computing aren't just about compliance and regulation like HIPAA or GDPR. Rules on how to handle data are important, but there's also value in hiring employees who can provide general legal advice around cloud computing. Questions can arise outside the purview of compliance, such as what the tax regulations are around cloud providers. For instance, in some situations, it does not make sense to displace existing hardware with cloud services. If that hardware hasn't fully deprecated, the associated tax benefit has not been fully realized. This can lead to a net loss that's much higher than any savings that might come from switching to the cloud. Other legal issues include how software licenses are transferable and adherences to service-level agreements. Most people who advise on cloud law are lawyers, although some are converted project managers.
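The depreciation point can be worked through with a rough straight-line sketch; all figures here are invented for illustration, not tax advice:

```python
def remaining_book_value(price: float, life_years: int, elapsed: int) -> float:
    """Straight-line depreciation: value still on the books if the
    hardware is retired early (hypothetical accounting sketch)."""
    yearly = price / life_years
    return max(price - yearly * elapsed, 0.0)

# Invented figures: $100k of servers on a 5-year schedule, retired
# after 2 years in favour of cloud services.
write_off = remaining_book_value(100_000, 5, 2)  # $60,000 not yet depreciated
annual_cloud_saving = 20_000                     # assumed yearly saving
print(write_off > annual_cloud_saving)           # True: a net loss up front
```

This is the kind of back-of-the-envelope calculation a legally and financially literate hire can sanity-check before a migration decision is made.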

Why CIOs should take extra precautions when buying IT support

“It is important to carefully evaluate primary support capabilities, such as the breadth and depth of the support team in each global region, the comprehensiveness of the service offering, and experience and scope in delivering vital tax, legal and regulatory updates, as well as strategic capabilities like modernisation and cloud services, hybrid IT, business-driven roadmap planning and application management services,” she said. “The right partner will help you to maximise the value of your existing applications and create the capacity to fund your modernisation, as well as free up resources to focus on transforming your IT systems,” Phelan added. “As such, another consideration for enterprises evaluating third-party support is how providers are investing in their proposition. The expertise and talent they bring into the organisation, and the innovative new services they offer, are key to helping companies transform.”

Six Degrees of Application Architecture

With application architecture, there is often a high-level approach or cycle that flows through the process of design and early development/prototyping. As time passes, we get better by leveraging frameworks and patterns that help reduce boilerplate or duplicate efforts in the design process. Think about it, since ORM frameworks have been introduced, few are spending time writing code that provides what Hibernate (as an example) provides. Even improvements to languages like Java cut down on the repetitive code which used to be required to process a list of objects. ... It is rare for work to be put into place to update those existing applications to take advantage of the new service. Even if the existing applications are not RESTful based, adding the functionality to make a call over HTTP is typically not that involved — especially if the value gained by the legacy application is significant. Most of the time the reason I encounter for such tasks not getting completed is tied to budgeting costs for the updates, validation/testing, and deployment.
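As a rough sketch of how little code such an HTTP bridge needs (shown in Python rather than the legacy application's own language; the endpoint path and payload shape are hypothetical):

```python
import json
from urllib import request

def build_lookup_request(base_url: str, payload: dict) -> request.Request:
    """Build the POST a legacy application would send to a newer
    RESTful service; the endpoint and payload are hypothetical."""
    body = json.dumps(payload).encode("utf-8")
    return request.Request(
        f"{base_url}/api/v1/lookup",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def call_new_service(base_url: str, payload: dict) -> dict:
    """The actual bridge call: a handful of lines, no framework needed."""
    with request.urlopen(build_lookup_request(base_url, payload),
                         timeout=5) as resp:
        return json.loads(resp.read().decode("utf-8"))

req = build_lookup_request("http://internal-gateway.local", {"id": 7})
print(req.get_method(), req.full_url)
```

The point stands regardless of language: the integration itself is cheap; the budgeting for validation, testing and deployment is where such tasks usually stall.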

Quote for the day:

"A leadership disposition guides you to take the path of most resistance and turn it into the path of least resistance." -- Dov Seidman

Daily Tech Digest - September 23, 2019

Artificial intelligence in marketing: when tech converges with the traditional

The first step is to identify what is meant by a ‘buying coalition’. A buying coalition more accurately reflects the temporary alliance of distinct personas that are assembled to make a purchasing decision. Previously, organisations had no automated contact creation; that knowledge lived only in the sales rep’s mind. Now, organisations are able to more accurately target the right buyers at the right time. By applying AI to CRM data, organisations are able to create full contacts for all potential buyers engaged in the decision-making process by capturing relevant activity and associating it with the correct opportunity. AI can then track all touch points with various opportunity contacts, analyse how they were engaged and who was engaged before and after them to show both the optimal number of buyers needed to close a deal and also how to sequence and communicate with these buyers in order to build a strategic coalition. This can only be done once AI has mapped the CRM data to the correct opportunity. We’re able to do this with our own persona-driven analysis as well.

Navigating the .NET Ecosystem

While it’s easy to get caught up in the past, and grumble over previous concerns and frustrations, we must move forward. Perhaps, arguably, one of the most logical paths forward is to unify .NET Core and .NET Framework ... dare I say, "Let’s make .NET great again!" Maybe I’ve gone too far, but let’s discuss the future. Where is Microsoft steering us? Let’s take a step back for a moment and discuss where we’ve come from, before diving into where we’re going. Not all .NET developers are aware of how their code compiles, and what it truly produces. "From the very beginning, .NET has relied on a just-in-time (JIT) compiler to translate Intermediate Language (IL) code to optimized machine code." — Richard Lander. Revisiting my earlier mention of the Mono project, we know there have been significant efforts around ahead-of-time (AOT) compilation for .NET. Mono has achieved this with its industry-leading LLVM compiler infrastructure.

Google and Udacity offer free Kotlin courses for Android developers

Since last year, Google has been working with online developer education company Udacity to provide free lessons and has now packaged them as 'codelab courses that are formatted like tutorials'.  The courses are aimed at developers who have some experience in programming object-oriented, statically typed languages like Java or C# and who've used IDEs such as JetBrains' IntelliJ IDEA, Android Studio, Eclipse, or Microsoft's Visual Studio. Students will need to install the Java Development Kit (JDK) and IntelliJ. Google promotes Kotlin as a "concise" and "modern object-oriented language [that] offers a strong type system, type inference, null safety, properties, lambdas, extensions, coroutines, higher-order functions".  It started offering free courses last year via the Kotlin Bootcamp course and is now offering them in the Google Developers Codelabs format. "Google and Udacity currently offer video-based courses for Kotlin Bootcamp and How to build Android apps in Kotlin," said Jocelyn Becker, senior program manager of Google Developer Training.

Your competitive edge: Unlock new possibilities with Edge computing

Today more than ever, there is little tolerance from employees, clients, or consumers when it comes to IT failures - fortunately we are entering a new age of technologies that will minimize this risk. Software defined networks (SDN) are becoming increasingly adept at identifying specific customer needs and running workloads to locations that best serve specific cost, latency and security requirements. When combined with the potential of AI to inform decision making, we can see a perfect storm of capabilities that will deliver a step change in how computing and networks converge to deliver a personalized service to customers who embrace these technologies. The added capability of MEC provides an extra layer, protecting essential services from outages or connectivity issues stemming from rare, but damaging ISP failures or cloud server downtime. This approach to network architecture can also enable local hosting of data, allowing data privacy to be better governed.
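A toy illustration of the placement decision an SDN controller might make, routing each workload to the site that best balances cost and latency while meeting its security requirement; all site names, figures and weights are invented:

```python
# Candidate locations a software-defined network could route to.
locations = [
    {"name": "edge-site",    "cost": 5.0, "latency_ms": 4,  "secure_enclave": False},
    {"name": "metro-dc",     "cost": 3.0, "latency_ms": 18, "secure_enclave": True},
    {"name": "public-cloud", "cost": 1.0, "latency_ms": 60, "secure_enclave": True},
]

def place(workload: dict, sites: list) -> dict:
    candidates = [s for s in sites
                  if s["secure_enclave"] or not workload["needs_enclave"]]
    # Weighted score: lower is better.
    return min(candidates,
               key=lambda s: workload["cost_weight"] * s["cost"]
                           + workload["latency_weight"] * s["latency_ms"])

realtime = {"needs_enclave": False, "cost_weight": 0.1, "latency_weight": 1.0}
batch    = {"needs_enclave": True,  "cost_weight": 1.0, "latency_weight": 0.01}
print(place(realtime, locations)["name"])  # edge-site (latency dominates)
print(place(batch, locations)["name"])     # public-cloud (cost dominates)
```

Real controllers weigh many more signals, but the shape of the decision is the same: per-workload requirements drive per-location scoring.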

New application security risks lead IT teams to DevSecOps

This type of organizational change doesn't come easily, but progress is possible, Ewert said. "When I started four years ago, there was a real 'us vs. them' mentality between developers and security," he said. "It took us three years to get the culture to the point where they approach us [for help]." That shift came about through training sessions and explanations for why certain security rules and practices are necessary, Ewert said. The SecOps team also found unobtrusive ways to dovetail its efforts with the rest of the organization's; they piggybacked on an IT monitoring project that rolled out standardized monitoring agents across the organization to get security monitoring agents installed, for example. It has also been helpful for Ewert's team to frame security recommendations in terms of how they can make the application and infrastructure more efficient, such as by avoiding costly distributed denial-of-service attacks. "We only block a release if it's absolutely critical.

Data Everywhere At the Edge of Compute & Energy Efficiency

In traditional or classic software development, or SW 1.0, a typical project for a software team may involve creating algorithms and writing millions of lines of instruction code. SW 2.0 is a new way of thinking in which value creation comes not from code writing, but from data curation. We collect data from our devices, select relevant data sets, verify and label them. We then use that “curated” data to train machine learning (ML) models. Imagine the data involved in running a large printing press. Curating data from a printing press might include engineers looking at images of final press output, finding defects (e.g., lines, splotches, roller marks, etc.), then training an ML model to detect those defects. The model “runs” in real-time at the press to monitor output. With SW 2.0, simple problems are quickly trained, and issues that might otherwise be impossible just take a bit longer.
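The printing-press example can be miniaturised: engineers curate labelled samples, and a trained model replaces hand-written defect rules. Here a deliberately tiny nearest-centroid classifier learns "defect vs. good" from synthetic, invented features (ink density, streak score):

```python
def centroid(points):
    """Mean point of a list of equal-length feature tuples."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(len(points[0])))

def train(curated):
    """curated: list of (features, label) pairs produced by engineers."""
    return {label: centroid([f for f, l in curated if l == label])
            for label in {l for _, l in curated}}

def predict(model, features):
    """Assign the label whose centroid is nearest (squared distance)."""
    return min(model, key=lambda label: sum(
        (a - b) ** 2 for a, b in zip(features, model[label])))

data = [((0.9, 0.1), "good"), ((0.8, 0.2), "good"),
        ((0.3, 0.9), "defect"), ((0.2, 0.8), "defect")]
model = train(data)
print(predict(model, (0.85, 0.15)))  # good
print(predict(model, (0.25, 0.85)))  # defect
```

The value creation sits in `data`, not the code: better curation improves the model without a single new line of logic, which is the essence of the SW 2.0 argument.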

Enterprises can hire software robots at the push of a button

The IT services roles are internal, providing services to IT teams, but most of the others in banking, insurance and retail are customer-facing, with bots being the first point of contact for customers making enquiries. Dube said about 112,000 customer calls a day are handled by IPsoft customer Telefonica in Peru using Amelia. “These calls are made when people have a problem with their account,” he said. “Humans have to be liberated from these chores.” Dube described the jobs being done by robots as “high-friction, low-margin roles”. They are jobs that have to be done and have high costs, but businesses do not make money out of them. Firms are replacing large numbers of staff in such roles with software robots. The World Economic Forum’s Future of Jobs 2018 report predicted that 75 million current jobs will be automated by 2022, and that 52% of jobs today will be done by robots by 2025.

Interview with Scott Hunter on .NET Core 3.0

Many times when we talk about .NET Core 3.0, we talk about the new desktop support, but there is also a lot of innovation in ASP.NET. First up, while we are not bringing back WCF, we know that many developers want to program high-performance, contract-based RPC services in their applications. We are embracing the open-source gRPC project for these workloads. We are working to make the .NET implementation first class, and because it is gRPC, it works with many other programming languages as well. There is a new microservices-related Worker Service project for building lightweight background workers, which can be run under orchestrators like Kubernetes. Also, while ASP.NET has great support for building APIs, we want to make it easy to put rich security on top of them, so we are adding bridges to use our APIs with the open-source Identity Server project. Finally, we are working on Blazor, which enables developers to build high-performance web applications using .NET in both the browser and server, using WebAssembly.

Innovation: How to get your great ideas approved

Historical precedent suggests that a top-down approach to innovation can produce big benefits. Harvard Business Review (HBR) tracked the invention history of 935 CEOs at publicly listed US high-tech companies and found that one in five of these successful firms had what it refers to as an inventor CEO: a chief executive named on at least one patent. While the research focused on the high-tech sector, HBR concludes that boards of directors should pay close attention to the inventor credentials of their executive teams. In the case of RBS, Hanley believes the senior team's role is to help point out opportunities and ensure any innovations help the business keep pace with a fast-changing finance market. "I guess our job is to not only think about the future, but almost work backwards from the future – we need to understand and have a point of view as to how we think the world is changing," says Hanley. "We need to articulate that view to all of our key stakeholders and then make sure we do something about it."

Cloud security: Weighing up the risk to enterprises

A good analogy is that clouds are like roads: they facilitate getting to your destination, which could be a network location, an application or a development environment. No one would enforce a single, rigid set of rules and regulations for all roads – many factors come into play, whether it is volume of traffic, the likelihood of an accident, safety measures or requirements for cameras. If all roads carried a 30 mile per hour limit, you might reduce fatal collisions, but motorways would cease to be efficient. Equally, if you applied a 70 mile per hour limit to a pedestrian precinct, unnecessary risks would be introduced. Context is not just important, it is imperative, and the same goes for cloud computing. To assess cloud risk, it is vital that we first define what cloud means. Cloud adoption continues to grow, and as it does, an explicit delineation between cloud and on-premise may no longer be necessary. Is the world of commodity computing displacing traditional datacentre models to such an extent that soon all computing will be elastic, distributed and based on virtualisation?

Quote for the day:

"Leadership does not always wear the harness of compromise." -- Woodrow Wilson