Daily Tech Digest - March 14, 2019

A second 737 Max crash raises questions about airplane automation


Instead of improving safety, innovations can allow airlines “to run greater risks in search of increased performance.” A high-ranking Boeing official told the Wall Street Journal that “the company had decided against disclosing more details to cockpit crews due to concerns about inundating average pilots with too much information—and significantly more technical data—than they needed or could digest.” But what good is a safety system that’s too intricate for highly trained professional airline pilots to understand? Each new automatic device, Perrow wrote, might solve some problems only to introduce new, more subtle ones. Make the system too complicated, he said, and it’s inevitable that regulators will lose track of which pilots had been told what, and that some pilots will get confused about which procedures to follow. It didn’t, he said, make much sense to blame pilots in cases like this. Pilot error, he said, “is a convenient catch-all.” But it’s the complexity of the system that’s really to blame.



Observability in Testing with ElasTest

In a distributed system, finding the root cause of a bug is definitely not easy. When I face this problem, I'd like to be able to compare a successful execution with a failed one, side by side. This is difficult due to the nature of logs, which can vary between executions. Such a tool should be able to identify the common patterns and discard the irrelevant pieces of information that vary between two executions. This comparison feature should also be available for any other kind of metric: if I can compare the memory consumption or request latency of two consecutive executions, I might be able to understand why the second one failed. In general, we need more specific tools for this task. In a testing environment we have more control over what information to store, but we need to make the tools aware of the testing process. This way, we can gather the necessary information during test runs and provide the appropriate abstractions to understand why a specific test failed.
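The comparison idea above can be sketched in a few lines: normalize away the fields that legitimately differ between runs (timestamps, request IDs), then diff the two logs so only meaningful differences surface. The log format and the masking rules here are invented for illustration; a real tool would learn the varying patterns instead of hard-coding them.

```python
import difflib
import re

def normalize(line):
    # Mask fields that legitimately vary between runs (timestamps,
    # request ids) so they don't show up as spurious differences.
    line = re.sub(r"\d{2}:\d{2}:\d{2}", "<TIME>", line)
    line = re.sub(r"req-\w+", "<REQ_ID>", line)
    return line

def compare_runs(passing_log, failing_log):
    a = [normalize(l) for l in passing_log]
    b = [normalize(l) for l in failing_log]
    # Keep only lines unique to the failing run: candidate root causes.
    return [l[2:] for l in difflib.ndiff(a, b) if l.startswith("+ ")]

passing = ["12:00:01 req-a1 GET /items 200",
           "12:00:02 req-a2 POST /order 201"]
failing = ["13:30:01 req-b1 GET /items 200",
           "13:30:02 req-b2 POST /order 500 timeout contacting db"]

print(compare_runs(passing, failing))
```

With the timestamps and request IDs masked, the only surviving difference is the failing `POST /order` line, which is exactly the signal a tester wants to see.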


The Effect of The Data Revolution in Enterprise Software Development

The advent of Big Data marks the rebirth of enterprise software. In the traditional model, it was a common practice for the entire enterprise to adapt around the software they used. In a recent survey, 80% of executives who used traditional software responded that the software negatively affected their company’s growth and that it wasn’t flexible enough to adapt to their changing needs. Meanwhile, Big Data made possible custom software development that works for the enterprise and moves with it. Focusing on short learning curves and intuitive interfaces, modern, data-driven software is empowering, not challenging. Enterprise software development now fuels innovation and boosts workplace productivity, preparing businesses for the digital age. Every business faces unique challenges and now, thanks to Big Data, dedicated enterprise software can address these challenges and modernize workflows. When bottlenecks are eliminated, all departments can collaborate seamlessly, utilize resources to the maximum, and stay agile across all project stages.


Security is a constant battle: Be the Secure Code Warrior in your organisation

The skills you learn using Secure Code Warrior cover over 150 types of vulnerabilities, including the OWASP Top 10 – this is very important to note. The Open Web Application Security Project is a global organisation that has done tremendous work over the years in formalising and promoting best practice in application security. Amongst other things, it provides the OWASP standard: a baseline of impartial, practical and cost-effective security best practice that can be used to establish a level of confidence in application security within an organisation, and to demonstrate a commitment to security to external parties such as customers and regulators. The importance of the alignment to the OWASP standard is that it is non-vendor specific, industry recognised and widely respected, and is not simply learning a single tool. The Secure Code Warrior platform teaches real-world skills, in the coding language of your choice, that are applicable to every industry on every platform, be that Enterprise, Cloud, Mobile or IoT.


How WebAssembly will change the way you build web apps

WebAssembly isn't a finished product; there is plenty of room for improvement in its support, features, and performance. "WebAssembly is young, what has landed in the browser right now is certainly not a fully mature product," says Williams, giving the example of garbage collection not being implemented in WASM as yet. "If you've started working with WebAssembly now, you're immediately going to go 'Why is my WASM so big and why is it not as fast as I want it to be?'. "It's because it's a new technology, but that being said it is designed to be faster than JavaScript and it is often faster, and, simply by function of it being an instruction set, your programs are going to be significantly smaller." Williams is bullish on the prospects for WebAssembly, and with many people looking to the future of the web as it celebrates its 30th birthday, she has high hopes for how WebAssembly might transform the platform.


Container management tools must overcome limitations


"A lot of different pieces are coming together and driving new opportunities, as well as creating new challenges [with container management]," said Stephen Elliot, program vice president at IDC. Much like virtualization, containers move application development one layer away from the computing infrastructure, so developers spend more time enhancing applications and less time configuring system resources. This approach fits with today's continuous development mantra, which stresses speed. Consequently, containers are becoming quite popular. In fact, the application containers market will create $4.3 billion in revenue in 2022, according to 451 Research's November 2018 Market Monitor study on application containers. When companies deploy containers, many find they require new container management tools to use them at scale. "Monitoring container platforms means managing numerous moving parts, from a security, performance, compliance and availability perspective," said Torsten Volk.


Culture & Methods – the State of Practice in 2019


It is only with cultural change that the promise of the new ways of working will be achieved. Many organizations are embarking on "Digital Transformation", and it is often the same organizations that have undergone two or three "Agile Transformations" in the past without seeing the promised benefits. We believe that this is because the adoption is often implemented to "pay lip-service" to the idea, and is shallow rather than truly transformational. According to the State of Agile report, agile software development has become a late majority approach; almost all software is now built using iterative and incremental approaches, mainly using some derivation of Scrum or Kanban. The strong technical practices from eXtreme Programming are still the exception rather than the norm in early and late majority firms. As an industry, we may know how to build software better, but there isn't the appetite to truly empower the teams and make the organisational changes needed to actually achieve the outcomes that Innovators have shown are possible.


Millions hit by major Android-based malware campaigns


After installation, the malware then connects with the command and control server to receive orders. These may range from opening a browser with a given URL to removing the app icon from the launcher. The app's three-pronged capabilities include showing ads, opening phishing pages, and exposing users to other applications. The attackers are also able to install a remote application from a designated server, allowing them to further infect users with malware at their discretion. "With the capabilities of showing out-of-scope ads, exposing the user to other applications, and opening a URL in a browser," the researchers said, "'SimBad' acts now as an Adware, but already has the infrastructure to evolve into a much larger threat." Check Point Research also outlined 'Operation Sheep' in a second report yesterday. This involves a group of Android apps harvesting contact information from users' phones on a mass scale without their consent. This malware has similarly been loaded in an SDK built for data analytics, and has been seen in up to 12 different Android apps to date.


Standardising The Data Scientist

With the data scientist categorised as an endangered species, it is difficult not only for organisations to recruit people into these roles, but also to ensure that they have the skills and expertise required for the job at hand. For James de Raeve of The Open Group, this can be attributed to a lack of standardisation in the profession. "There needs to be a common understanding around the skills and experiences of professional data scientists," he says. "This way, there is guidance to help individuals grow their professional competence, while organisations gain support in building career models that meet the requirements of the business now and in the future." In an effort to remedy this, The Open Group and IBM have come together to create a new certification programme for data science. Through the programme, individual data scientists can certify their skills via the process of peer review, thereby building trust in their profession. Similarly, at the business level, organisations can not only ensure that job opportunities are filled appropriately, but also develop individuals through their own, accredited programme.


Power BI Security

Power BI uses two primary repositories for storing and managing data: data that is uploaded from users is typically sent to Azure BLOB storage, and all metadata as well as artifacts for the system itself are stored in Azure SQL Database. The dotted line in the Back-End cluster image, above, clarifies the boundary between the only two components that are accessible by users (left of the dotted line), and roles that are only accessible by the system. When an authenticated user connects to the Power BI Service, the connection and any request by the client is accepted and managed by the Gateway Role (eventually to be handled by Azure API Management), which then interacts on the user’s behalf with the rest of the Power BI Service. For example, when a client attempts to view a dashboard, the Gateway Role accepts that request then separately sends a request to the Presentation Role to retrieve the data needed by the browser to render the dashboard.



Quote for the day:


"Enthusiasm spells the difference between mediocrity and accomplishment." -- Norman Vincent Peale


Daily Tech Digest - March 13, 2019

Wearable tech in the enterprise grows, but few workplace uses exist
More interesting than the growth predictions, perhaps, the report splits enterprise wearable use into two parts: employee use and customer applications. The top employee uses include workplace security, employee time management, and employee communications. From my perspective, though, much of that sounds like messaging and checking the time. The customer application side seems more promising. Common customer applications predicted include loyalty programs, point-of-sale (PoS) uses and something called an "integrated shopping experience." I don't know about you, but I'm not seeing a lot of smartwatch-based loyalty programs or PoS applications. I have no idea what an integrated shopping experience is, but I'm pretty sure I haven't seen one of those on a smartwatch, either. The report cited the lack of applications as the biggest barrier to B2B adoption of wearable tech, and that problem doesn't seem to have disappeared. Nor have other concerns in the report, including cost, device capabilities, and security.



Cyber attackers favouring stealthier attacks, says Darktrace


Despite the rise in popularity of cryptojacking, however, banking Trojans once again appear to be the most profitable tool for cyber criminals, according to Max Heinemeyer, director of threat hunting at Darktrace. "Unlike ransomware, banking Trojans do not rely on a victim's conscious willingness to pay. Instead, they use deception to perform transactions without the victim's knowledge. Given the decline in ransomware incidents in 2018, it seems that subtler attacks have become the weapons of choice for hackers," he said. In one Fortune 500 e-commerce company, Darktrace discovered a privileged access user – a disgruntled systems administrator – was hijacking power sources from the company's infrastructure for monetary gain. The employee co-opted other users' credentials and service accounts to stealthily take over multiple machines for the purpose of cryptomining. Darktrace's 2018 threat data also revealed that more than 15% of internet of things (IoT) devices detected by its AI technology were unknown to the businesses concerned, with a 100% year-on-year increase in IoT attacks.



Researchers build nanoscale distributed DNA computing systems from artificial protocells


Living cells communicate with each other by sending and receiving molecular signals that diffuse between neighboring cells to activate key molecular processes. This communication enables cell populations to implement collective information processing functions that cannot be achieved by individual cells in isolation. Although synthetic biologists have made significant progress in engineering cell populations to perform computation, such engineering still remains a major challenge because of the complex interplay between synthetic devices and natural cellular processes. In a new paper published in the journal Nature Nanotechnology, a team of researchers led by Tom de Greef at the Eindhoven University of Technology and Radboud University, Stephen Mann at the University of Bristol, and Andrew Phillips at Microsoft Research present a method for implementing distributed DNA computing systems by compartmentalizing DNA devices inside artificial protocells.


Data and analytics offer a seismic opportunity in manufacturing

Until recently, predictive maintenance was a time-consuming, manual process requiring large teams of data scientists. It was used in industries where regulation demands it, as well as by a small number of pioneering industrialists to monitor their most important production assets. Although predictive maintenance was proven to keep aircraft in the air and the most vital wheels of industry turning, the vast majority of manufacturers couldn't capitalise on the opportunity due to the high cost and complexity of gathering and analysing sufficient data to drive tangible results. Things started to move on, however, with the adoption of IoT sensors and machines capable of recording their own vital statistics. Manufacturers can now collect large amounts of data from across their production environments and relay it to factory historians or third-party platforms such as Siemens MindSphere and the OSIsoft PI System. The ease with which this can be done has led to a data rush. Around two-thirds of manufacturers today are gathering data from their production environments.


Why Every Company Needs A Data Strategy For 2019

Creating a data strategy isn't a standalone activity; it must be driven by your overarching business strategy. Therefore, a critical starting point for any data strategy is the business's strategic objectives. To put it another way, what is your business trying to achieve and how can data help you get there? After all, what's the point of a data strategy – indeed, what's the point of data in general – if it doesn't help you achieve your organizational goals? So before you charge ahead with your data strategy, review your business strategy first and then develop your data strategy ... When you use data to make smarter decisions, optimize business processes and so on, it's likely to have a positive effect on the bottom line. But this link between data and the bottom line can also be much more direct. In other words, data can be monetized to increase revenue and create a new income stream. This monetization may take many forms. For example, it may involve bringing new, data-driven products or services to market. Or it may involve selling data to customers through optimized services.


The NSA Makes Ghidra, A Powerful Cyber Security Tool, Open Source


The NSA announced Joyce’s RSA talk, and Ghidra’s imminent release, in early January. But knowledge of the tool was already public thanks to WikiLeaks’ March 2017 “Vault 7” disclosure, which discussed a number of hacking tools used by the CIA and repeatedly referenced Ghidra as a reverse-engineering tool created by the NSA. The actual code hadn’t seen the light of day, though, until Tuesday—all 1.2 million lines of it. Ghidra runs on Windows, MacOS, and Linux and has all the components security researchers would expect. But Joyce emphasized the tool's customizability. It is also designed to facilitate collaborative work among multiple people on the same reversing project—a concept that isn't as much of a priority in other platforms. Ghidra also has user-interface touches and features meant to make reversing as easy as possible, given how tedious and generally challenging it can be. Joyce's personal favorite? An undo/redo mechanism that allows users to try out theories about how the code they are analyzing may work, with an easy way to go back a few steps if the idea doesn't pan out.


Unbelievable Ways Artificial Intelligence Is Revolutionizing Education

How have teachers been developing their careers in the new technological age? In the past, they did it with the help of specialized workshops, seminars, or courses. Unfortunately, professors have little time for self-development because they must come up with an individual approach to their students. The development of both AI and global education ecosystems can change this situation. For example, American Institutes for Research (AIR) provides teachers with online training courses and webinars to share experiences with colleagues. Another solution is The Global Education Conference, a project that gives access to educational conferences and hundreds of courses for professional growth. Such innovations can deliver astonishing results. For teachers, AI has become synonymous with expanding the boundaries of how you can share experiences and knowledge. ...  An intellectual curriculum can also be developed on a turnkey basis for a particular student. In this case, there is no need to buy costly textbooks and materials.


You can’t benchmark culture


A benchmarking exercise is one of analyzing and copying the things another company does. (Companies can also benchmark their own current performance against past performance.) That won’t work for cultural elements, because every company’s cultural situation is as unique as a fingerprint. It incorporates emotionally resonant, deeply embedded perspectives and habits that have built up through years of challenges and experience; these factors can’t be easily separated from one another. Moreover, these elements have to fit the company’s strategy and core capabilities, or the company won’t be able to continue delivering value. The behaviors and emotions that should be emphasized in one company may be precisely those that would hold another company back. For example, at PetroChem, the leaders sought to borrow a new behavior from Polymer Plus and Google: rapid launches of imperfect products, which would be improved later, after the product had been on the market.


NVMe startups’ different routes to flash performance gains


All the startup solutions presented so far are based on new hardware designs, but another route being taken by startups such as Excelero and WekaIO is to eliminate the storage hardware and go completely software-defined. Of course, there has to be some storage hardware somewhere, but the benefit of the software-defined architecture is that solutions can be implemented either as a hyper-converged infrastructure (HCI), dedicated storage or even in public cloud. Excelero has created a storage solution called NVMesh that implements NVMe-over-Fabrics through a proprietary protocol called RDDA. Where RDMA connects multiple servers together and provides direct memory access, RDDA takes that a step further and makes NVMe storage devices accessible across the network. This is achieved without involving the processor of the target server, and so delivers a highly scalable solution that can be deployed in multiple configurations, including HCI.


Hackers Love to Strike on Saturday

Attackers were more likely to strike on a Saturday than any other day. This makes obvious sense, since organizations are likely to have fewer employees minding the shop. After a successful attack, hackers would also have more time to explore a network before employees returned to work on Monday. One caveat, however, is that in some parts of the world, the weekend begins on a Friday. In Bangladesh, for example, Friday is a Muslim day of prayer. Not coincidentally, that's the day attackers hit Bangladesh Bank. ... Want to bury bad news? One age-old strategy long practiced by businesses and politicians in the U.S. is to release bad news on a Friday, after markets have closed, in the hope that it will be old news by Monday. Many breached businesses, likely not coincidentally, publish their first public data breach notification on a Friday. Redscan found that half of all breach reports were submitted to the ICO on a Thursday or Friday, suggesting that this ethos exists in the U.K. as well.



Quote for the day:


"Leadership in today's world requires far more than a large stock of gunboats and a hard fist at the conference table." -- Hubert H. Humphrey


Daily Tech Digest - March 12, 2019

The 3 surprising secrets that drive innovation in the digital era

It's inevitable: hear the word innovation and you immediately start thinking about technology. After all, innovation and technology have been nearly synonymous for most of the last two decades. This inclination is even more likely if you're an IT professional, given our natural fondness for technology. But if you want to transform your organization into an innovation machine, the place to start is with the recognition that innovation is not, in fact, about technology at all. ... The way Ubels discussed what the company was doing was illuminating. "I love technology, but it's about building better buildings for the world," he explained during a subsequent conversation we had on the subject. "It's healthy, sustainable, the best working environment for employees. There's a war for talent and a building is an important part of how you express yourself as an organization, and a building that people like to go to." Here was the person responsible for the technology at a company that had made technology a central component of its value proposition — and there was almost no talk about technology either from the keynote stage or during our conversation.



5 steps to performing an effective data security risk assessment

A threat is anything that has the potential to cause harm to the valuable data assets of a business. The threats companies face include natural disasters, power failure, system failure, accidental insider actions (such as accidental deletion of an important file), malicious insider actions (such as a rogue agent gaining membership to a privileged security group), and malicious outsider actions (such as phishing attacks, malware, spoofing, etc.). Each company should have its central risk team determine the most probable threats and plan accordingly. ... A vulnerability is a weakness or gap in a company's network, systems, applications, or even processes which can be exploited to negatively impact the business. Vulnerabilities can be physical in nature (such as old and outdated equipment), they can involve weak system configurations (such as leaving a system unpatched or not following the principle of least privilege), or they can result from awareness issues. Similar to determining threats, analyzing vulnerabilities is also best completed by the central risk team.
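Once threats and vulnerabilities are catalogued, the central risk team needs a way to rank them. A minimal sketch of the classic qualitative approach (risk = likelihood x impact) is below; the threat names echo the article's examples, but the numeric scores are invented for illustration.

```python
# Hypothetical threats with likelihood and impact scores (1-5).
# The scores are illustrative only; a real risk team would calibrate
# them against incident history and business context.
risks = [
    {"threat": "phishing attack",                    "likelihood": 4, "impact": 4},
    {"threat": "unpatched system exploited",         "likelihood": 3, "impact": 4},
    {"threat": "rogue insider in privileged group",  "likelihood": 2, "impact": 5},
    {"threat": "power failure",                      "likelihood": 2, "impact": 3},
]

def prioritize(risks):
    # Qualitative risk scoring: risk = likelihood x impact.
    for r in risks:
        r["score"] = r["likelihood"] * r["impact"]
    return sorted(risks, key=lambda r: r["score"], reverse=True)

for r in prioritize(risks):
    print(f'{r["score"]:>2}  {r["threat"]}')
```

The sorted output gives the team a defensible order in which to plan mitigations, with the highest-scoring risks addressed first.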


The buzz at RSA 2019: Cloud security, network security, managed services and more

Remember a few years ago when we were all shocked by dual exhibition floors in Moscone north and south? Well, the RSA conference addressed this by making one contiguous show floor in and between both buildings. Why so many vendors? Because every individual technology in the security technology stack is in play, driven by things like machine learning algorithms, cloud-based resources, automation, managed services components, etc. All these vendors may be a boon to industry trade shows, but they are confusing the heck out of cybersecurity pros. Instead of buzz words and hyperbole, successful vendors will invest in user education and thought leadership, offering guidance and support for customers and prospects. ... Large cybersecurity vendors are jumping on this trend with integrated cybersecurity technology platforms and moving toward enterprise license agreements and subscription-based pricing. Many of the vendors I met with are now tracking multi-product deals and incenting direct sales and distributors in this direction.


Applying Artificial Intelligence in the Agile World

There are a growing number of customer service software products that let you combine your existing knowledge base support with chatbots to provide pre-canned and self-learning responses to customer queries. This is a great way to get started with experimenting with self-learning capabilities. Recommendation systems, as popularised by Netflix's movie recommendation feature, have made significant advancements in recent years. These can be easily integrated into existing systems to add self-learning capabilities. For example, collaborative filtering systems can collect and analyze users' behavioral information in the form of their feedback, ratings, preferences, and feature usage. Based on this information, these systems exploit similarities amongst several users to suggest user recommendations. The emergence of operational chatbots, as popularised by GitHub's open source project Hubot, is changing the traditional operations paradigm. Work that previously happened offline is now being brought into chat rooms using communication tools such as Slack.
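The collaborative filtering idea described above can be sketched in pure Python: score each unseen item by other users' ratings, weighted by how similar those users' rating vectors are to yours (cosine similarity over shared items). The users, features, and ratings below are invented toy data.

```python
import math

# Toy user -> item rating matrix (1-5); all data invented for illustration.
ratings = {
    "ana":  {"docs": 5, "search": 3, "chat": 4},
    "ben":  {"docs": 4, "search": 2, "chat": 5, "boards": 4},
    "cara": {"docs": 1, "search": 5, "boards": 2},
}

def cosine(u, v):
    # Similarity over the items both users have rated.
    shared = set(u) & set(v)
    if not shared:
        return 0.0
    dot = sum(u[i] * v[i] for i in shared)
    return dot / (math.sqrt(sum(u[i] ** 2 for i in shared)) *
                  math.sqrt(sum(v[i] ** 2 for i in shared)))

def recommend(user):
    # Score items the user hasn't rated by similar users' ratings.
    scores = {}
    for other, their in ratings.items():
        if other == user:
            continue
        sim = cosine(ratings[user], their)
        for item, rating in their.items():
            if item not in ratings[user]:
                scores[item] = scores.get(item, 0.0) + sim * rating
    return max(scores, key=scores.get) if scores else None

print(recommend("ana"))
```

Because "ana" rates items much like "ben" does, her top recommendation is the feature ben uses that she hasn't tried yet.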


Cloud monitoring, management tools come up short


Cost and complexity were the top reasons given for cloud-monitoring failures. Forty-five percent said cloud support required additional software licenses or network monitoring tool modules, which they didn’t want to pay for. Forty-four percent indicated that cloud support in their tools was too difficult to implement or use. They simply couldn’t get value out of the updated tools. “Due to complacency and limitations of the software itself, we had to get rid of [a tool],” one IT executive at a North American distributor of heavy, manufactured products told EMA. “It’s not worth the time and investment. We didn’t want to spend more money on a new version that was just a redux of an older version. I didn’t see any real progress in the product.” Furthermore, 35 percent said their vendors had done a poor job of adding cloud-monitoring support to their tools, with the functional updates failing to meet their needs. And 28 percent said their vendors had failed to even establish a roadmap for cloud monitoring.


2019’s Most Inquired Professional Services Marketplace Model

Be it medical services, freelancing, travel or hospitality, to name a few, whatever its specifics, a services marketplace's prime role is to connect people with service providers. Thumbtack, TaskRabbit, Handy.com and many more service marketplaces are becoming routine names for people. It took a good ten years for customers to warm up to the idea of services marketplaces. Having experimented with many varied economic models, the services marketplace industry has undergone several phases of evolution. On this note, it becomes vital for companies to have a killer business model to lead and survive in the competition marathon. A number of businesses have recognized the essential aspects that contribute to designing a lucrative business model. This blog gives a firsthand look at the key elements of the professional services marketplace model that's pretty perfect for a services marketplace.


Get started with natural language processing APIs in cloud


With the popularity of voice assistant technologies, natural language processing APIs and similar services have become one of the most in demand -- and better understood -- subdisciplines of AI. There are decades of research to support the field, and it's used in countless products to analyze speech and text for language and sentiment, improve the ability to search unstructured data and even parse intent from conversations as they happen. Natural language processing has only recently become affordable enough to productize for the general public. Today, it is so commonplace that the major cloud providers -- as well as a number of smaller players -- offer it as a service. Each vendor has its own feature set to process natural, human-readable text. Let's review some of the most prominent natural language processing APIs and cloud-based services, as well as ways developers can incorporate them into applications.
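Whichever vendor you pick, the response shape is broadly similar: a sentiment score (negative to positive) plus some measure of how much emotional content the text carries. The pure-Python sketch below only illustrates that response shape with a tiny invented word lexicon; real cloud NLP services use trained models, not word lists, and their exact field names differ by vendor.

```python
# Minimal lexicon-based sentiment scorer, illustrating the shape of a
# typical NLP API response: a score in [-1, 1] plus a magnitude.
# The lexicon is invented for this sketch.
POSITIVE = {"great", "love", "fast", "reliable"}
NEGATIVE = {"slow", "broken", "hate", "confusing"}

def analyze_sentiment(text):
    words = text.lower().split()
    hits = [(1 if w in POSITIVE else -1)
            for w in words if w in POSITIVE or w in NEGATIVE]
    # Score averages the polarity of sentiment-bearing words;
    # magnitude counts how many such words appeared.
    score = sum(hits) / len(hits) if hits else 0.0
    return {"score": score, "magnitude": float(len(hits))}

print(analyze_sentiment("love the fast search but the UI is confusing"))
```

A mixed sentence like the one above yields a mildly positive score with a non-trivial magnitude, which is how the cloud services distinguish "mixed feelings" from "no feelings".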


Why CISOs Need Partners for Security Success

More and more CISOs are buying into the strategy of involving members of the C-suite as well as other leaders in key projects, Pescatore said. For instance, CISOs at power plants and other large manufacturing facilities are working with COOs to show how business results are affected when systems are offline due to a ransomware attack or another type of cyberattack, clearly demonstrating why there's a need for better security to improve reliability and resilience in the face of an interruption. ... The security team may not understand the goals of the development team and may lack the skills to keep up with the rapid pace of application development, Pescatore explained. "So the slowdown is really two things," Pescatore told me after his presentation. "The first is not understanding how the business works. It's about saying no to everything when sometimes there's no risk that anyone will care about. The second is skills - the security team might not be up to the task of going as fast as the other side."


How to shop for CDN services

Content delivery networks are the transparent backbone of the Internet that bring users every piece of content to their PCs or mobile browsers – from news stories to shopping sites to live-streaming video. For more than a decade, a content delivery network’s primary mission has been to reduce latency by shortening the distance between a website’s visitor and its server. Today, however, the stakes are much higher. Skyrocketing streaming demands, growing consumer impatience, spikes in global live viewership, and shifting device preferences are all changing CDN services, according to a study by streaming platform Conviva. Its users’ overall viewing hours increased 89 percent in 2018, including a 165 percent jump in streaming TV viewership in the fourth quarter alone, according to the study. Live content drove much of the surges, including a 217 percent spike in U.S. news watching during November’s mid-term elections.  At the same time, rising expectations about video streaming quality have viewers more impatient than ever.


How AWS, Azure and Google approach service mesh technology


Some users only want service mesh connectivity and load balancing for their microservices. Here, Microsoft users will want to consider Azure Service Fabric. It supports deployment on other public clouds, which makes it the top service mesh for multi-cloud. Also consider Google's Kubernetes Engine and Istio, particularly if you're a Kubernetes shop. Amazon's basic service mesh tools are great for AWS users, but less versatile in multi- and hybrid cloud deployments. The middle ground, where most users will probably find themselves, is a bit more difficult to read at this point. Microsoft and Google have signaled they'll support a fairly portable service mesh vision via Azure Service Fabric and Google's Kubernetes-Istio combination, respectively. Amazon's middle ground is still divided and somewhat primitive compared to its competitors, which likely means more upgrades are on the way. In the long run, service mesh, managed container services and even serverless are likely to converge into a single uniform resource model for applications.



Quote for the day:


"Perhaps the ultimate test of a leader is not what you are able to do in the here and now - but instead what continues to grow long after you're gone" -- Tom Rath


Daily Tech Digest - March 11, 2019

A primer for CIOs needing ‘deep learning’ on the benefits of emerging tech

Davenport suggests that CIOs and business leaders need to look at AI through the lens of business capabilities rather than the lens of technology. He says that process automation focuses upon the automation of digital and physical tasks using RPA. More importantly, Davenport says RPA is the least expensive and easiest to implement; it brings efficiency, consistency and standardization, saves time, and is an integral part of any digital process. Davenport contrasts RPA with “cognitive insight” and “cognitive engagement”. Cognitive insight uses algorithms to detect patterns in vast data sets and, with this, interpret their meaning. Insights here are typically provided by machine learning. These supervised or unsupervised approaches are data intensive and detailed. Models are typically trained on a portion of a data set. Models get better, in other words make more accurate predictions, as they are fed new data. Clearly, the more data the better, especially for things like facial recognition. Cognitive insight often uses deep learning, a version of machine learning that attempts to mimic the human brain to recognize patterns.


Cybercrime is increasing and more costly for organizations

Cyber attacks are evolving from the perspective of what they target, how they affect organizations, and the changing methods of attack, according to the study, which is based on interviews with 2,647 senior leaders from 355 companies across 11 countries and 16 industries. Information theft is the most expensive and fastest rising consequence of cyber crime. However, data is not the only target. Core systems such as industrial controls are being hacked in a dangerous trend to disrupt and destroy, the report said. While data remains a key target, theft is not always the outcome of an attack. A new wave of cyber attacks sees data no longer just being copied but being destroyed or changed, in attempts to breed distrust. Attacking data integrity is the next frontier of cyber threats, the report said. Cyber criminals such as hackers are adapting their attack methods. They are aiming at the human layer, which the researchers said is the weakest link in cyber defense, through increased ransomware and phishing and social engineering attacks as a path to entry.


Gen Z: The Challenges and Opportunities with New Talent from a New Generation


Unlimited access to technology and the internet has led Gen Zers into a mindset of hyper customization. Young people today are much less willing to follow a straightforward career path. Gen Zers have seen that only about 27 percent of college graduates are currently working in the field of their major and this has led them into wanting to take a more proactive role in deciding and designing their career path. In fact, the Society for Human Resource Management finds that 56 percent of all Gen Zers say they want to write their own job description. While the dream for Gen Zers might be running their own company as a route to financial security and wellbeing, they will also covet opportunities to innovate and create value for the companies they work for. Lastly, as fully digital natives, the Gen Z workforce will obviously search for companies that offer the most advanced technological developments within their respective workplaces.


How Artificial Intelligence Could Transform Medicine

Dr. Topol believes that A.I. can do more than enhance diagnoses and treatments. It can also save doctors from doing tasks like taking notes and reading scans, allowing them to spend more time connecting with their patients. Recently, we caught up with Dr. Topol to discuss his thoughts on where A.I. has the most potential to improve health care, where it might stumble, and how it could protect doctors from things like burnout and depression. Here are edited excerpts from our interview. ... For the first time we’ve got real-time, objective metrics for state of mind and mood like tone of speech, breathing pattern, smartphone keystrokes and communication, and physical activity. And we’ve learned people would rather share their innermost secrets with an avatar compared with a human being. So, the landscape is ripe for A.I. to help alleviate the profound shortage of health professionals compared with the enormous burden of depression and other mental health conditions.


Deep Learning: When Should You Use It?


True, it is getting easier to use deep learning. Part of this is due to the ubiquity of open source platforms like TensorFlow and PyTorch. Then there is the emergence of cloud-based AI systems, such as Google’s AutoML. But such things only go so far. “Each neural network model has tens or hundreds of hyperparameters, so tuning and optimizing these parameters requires deep knowledge and experiences from human experts,” said Jisheng Wang, who is the head of data science at Mist. “Interpretability is also a big challenge when using deep learning models, especially for enterprise software, which prefers to keep humans in the loop. While deep learning reduces the human effort of feature engineering, it also increases the difficulty for humans to understand and interpret the model. So in certain applications where we require human interaction and feedback for continuous improvement, deep learning may not be the appropriate choice.” However, there are alternatives that may not be as complex, such as traditional machine learning.


Humans Wanted: Robots Need You

Manufacturing and production anticipate the most change: 25% of employers say they will employ more people in the near-term while another 20% say they will employ fewer – resulting in job growth together with significant skills disruption in the industry. Growth will come too in frontline and customer-facing, engineering, and management roles, all of which require human skills such as advanced communication, negotiation, leadership, management and adaptability. In other functions, administrative and office roles are shrinking and overall HR headcount is expected to stay the same. ... Demand for tech and digital skills is growing across all functions, yet employers place increasing value on human skills as automation scales and machines prove better at routine tasks. While 38% of organizations say it is difficult to train in-demand technical skills, 43% said it is even harder to teach the soft skills they need such as analytical thinking and communication.


Citrix breach once again highlights password weaknesses


In a statement, Citrix said it has taken action to contain the breach, begun a forensic investigation and engaged a cyber security firm to assist. The software firm said it has also taken actions to secure its internal network. “While our investigation is ongoing, based on what we know to date, it appears that the hackers may have accessed and downloaded business documents. The specific documents that may have been accessed, however, are currently unknown. At this time, there is no indication that the security of any Citrix product or service was compromised,” the statement said. The notification, however, has sounded alarm bells for governments and military organisations, as well as the more than 400,000 organisations around the world that use Citrix products and services, raising fears that their networks may be at risk of compromise. According to security firm Resecurity, Citrix was breached by the Iranian-linked group known as Iridium, which has hit more than 200 government agencies, oil and gas companies and technology companies.


Price transparency in healthcare requires accuracy via better use of technology

It’s the consumer-driven health plans, where patients are now responsible for more. They have to make a decision – “Do I buy my groceries, or do I have an MRI.” The shift in healthcare makes us go after the patient before insurance is paid 100 percent. Patients now have a lot of skin in the game. And they have to start thinking, “Do I really need this procedure, or can it wait?” ... It's actually a tremendous opportunity for technology to help patients and providers. We live in an experience economy, and in that economy everyone is used to having full transparency. We’re willing to pay for faster service, faster delivery. We have highly personalized experiences. And all of that should be the same in our healthcare experiences. This is what people have come to expect. And that's why, for us, it’s so important to provide personalized, consumer-friendly digital payment options.


What to Look for in an AI Partner

Focus is not always enough. Does your potential partner have the expertise to actually solve your particular problem? Expertise is a complicated issue. Partners need a certain level of domain knowledge. The team assigned to your organization must possess an understanding of your unique pain points and overall business. It doesn’t have to be exhaustive, but every industry is unique in some way, whether it’s in terms of regulations or customer profiles or something else, and if your team is not familiar, it can lead to big problems later. At the same time, deep data science experience is also essential. The models are the foundation of every AI solution. They must be carefully constructed, and now for the super challenging part: They need to be packaged in consumer-grade software and delivered through services that can drive operational impact in a manner applicable to your domain. And expertise does not stop there. Your chosen partner needs to be able to map out a clear path to implementation.


Is Blockchain an Enabler of Data Monetization?

A shared, distributed ledger (blockchain) has the following Big Data ramifications: Common access to the same data for all parties involved in the transaction. This should accelerate data acquisition, sharing, data quality, data governance and ultimately, data analytics. Provides a detailed register of all transactions or engagements kept in a single “file” or blockchain. It provides a complete view of the entire transaction, from engagement start to engagement finish. No need to integrate pieces of data from multiple systems in order to create a single view of the entire engagement or transaction history. Provides the ability to manage and control one’s own personal data without the need for a third-party intermediary or centralized repository. Blockchain provides the potential to truly democratize the sharing and monetization of data and analytics by removing the middleman from facilitating those transactions (potentially acing out those middlemen).
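The "detailed register of all transactions kept in a single file" idea above boils down to a hash-linked list. Here is a minimal sketch; the names and the chain format are illustrative, not any production blockchain's API:

```kotlin
import java.security.MessageDigest

fun sha256(s: String): String =
    MessageDigest.getInstance("SHA-256")
        .digest(s.toByteArray())
        .joinToString("") { "%02x".format(it) }

// Each block commits to its predecessor's hash, so the whole engagement
// history forms a single tamper-evident register.
data class Block(val index: Int, val data: String, val prevHash: String, val hash: String)

fun newBlock(index: Int, data: String, prevHash: String): Block =
    Block(index, data, prevHash, sha256("$index|$data|$prevHash"))

// Append a transaction record to the chain.
fun append(chain: List<Block>, data: String): List<Block> =
    chain + newBlock(chain.size, data, chain.lastOrNull()?.hash ?: "0")

// Valid only if every block links to its predecessor and its own hash
// still matches its contents.
fun isValid(chain: List<Block>): Boolean =
    chain.withIndex().all { (i, b) ->
        b.prevHash == (if (i == 0) "0" else chain[i - 1].hash) &&
            b.hash == sha256("${b.index}|${b.data}|${b.prevHash}")
    }
```

Because every party validates the same linked register, altering any past entry invalidates everything after it, which is what removes the need for a third-party intermediary to vouch for the history.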



Quote for the day:


"If you are truly a leader, you will help others to not just see themselves as they are, but also what they can become." -- David P. Schloss


Daily Tech Digest - March 10, 2019

Hacking Our Identity: The Emerging Threats from Biometric Technology

Despite the seemingly enormous potential of biometric technology and its applications, the security it provides seems to be just an illusion due to the complex process, policy and people challenges it brings with it. While it is almost impossible to lose or replace biometrics, the question remains whether biometrics technology is foolproof and ready for global implementation. That brings us to an important question: can the evolving biometric system be in itself a complete human identification and authentication system, or can it only be part of an identification system? ... Perhaps the biometric system can only be one part of an overall human identification or authentication process, as there are many other variables and parts of that process that will need to play an equal role in determining identity verification effectiveness. Moreover, since the evolving biometric technologies are vulnerable to errors and are easily tricked and manipulated (by AI), it is important that we evaluate whether the ongoing effort towards human identity authentication gives the decision-makers the level of security they are hoping for.



Why Our Brains Fall for False Expertise, and How to Stop It

The brain uses shortcuts to manage the vast amounts of information that it processes every minute in any given social situation. These shortcuts allow our nonconscious brain to deal with sorting the large volume of data while freeing up capacity in our conscious brain for dealing with whatever cognitive decision making is at hand. This process serves us well in many circumstances, such as having the reflex to, say, duck when someone throws a bottle at our head. But it can be harmful in other circumstances, such as when shortcuts lead us to fall for false expertise. At a cognitive level, the biases that lead us to believe false expertise are similarity (“People like me are better than people who aren’t like me”); experience (“My perceptions of the world must be accurate”); and expedience (“If it feels right, it must be true”). These shortcuts cause us to evaluate people on the basis of proxies — things such as height, extroversion, gender, and other characteristics that don’t matter, rather than more meaningful ones.


How to create a transformational cybersecurity strategy: 3 paths


"There's a fine line between the deeply technical, scientific part of cybersecurity, and the people part, which we spend less time talking about—the stuff that actually enables a sustainable transformation," Budge said. "We've seen how one without the other can fail." A good strategy moves security from an IT issue to one of customer trust, Budge said. It also moves security from a technically-focused discipline to a holistic one, and gives business the freedom to achieve its digital aspirations, rather than acting as a blocking agent, she added. Bad cybersecurity strategies are those that cause companies to miss the breaches they experience, that invest in the wrong areas, that require teams to spend their time responding tactically, and that struggle to attract and retain talent, Budge said. No one silver bullet exists for creating a cybersecurity strategy; each is dependent upon the size of the organization, its cybersecurity maturity, and the level of support in the organization, Budge said.


People Are More Complex Than Computers

Defining what Human Resources looks like and how this functions in a decentralised organisation with more than half of its staff being independent consultants is the first issue. Another realisation that came pretty quickly is that software metaphors can only take you so far. People have feelings, whereas computers don't. In real life, you are always testing in production, there is no "staging environment". In software when you make a mistake you can try again many times, or write an automated test to make sure that the same issue won't happen again. In real life, this is impossible. Balancing freedom versus accountability is hard, as are diversity and inclusion when growing a global, distributed organisation. In short, growth of an organisation is neither linear, nor predictable. What Equal Experts has learned from this process is that bigger is different, and many times you need to dynamically adapt, or "make it up as you go". As long as you strive for continuous improvement, and trust and empower your people, you are setting yourself up for success.


These are the keys to recruiting (and keeping) your Gen Z employees


“Gen Z is 100% digitally native, meaning they are the first job seekers to be born during the age of smartphones, self-service online tools and AI-enabled virtual assistants like Siri and Alexa. They’ve never known a world without the convenience and speed of digital interaction. Much of their time is spent on social media, streaming videos and gaming online,” says Kurt Heikkinen, CEO of candidate engagement and interview software Montage. “As a result, because so much of their world is instant, digital, and seamless, they expect the exact same experience when it comes to job searches and the hiring process. To create the kind of candidate experience that will engage Gen Z and accelerate job offers, explore interviewing technology that gives candidates more choice and control–like automated scheduling, AI-enabled virtual hiring assistants, and on-demand interviews–that offers candidates the high-touch, high-tech experience that they want during their job hunts.”


JP Morgan’s Stablecoin: A Feat of Engineering or Marketing?


JP Morgan’s stablecoin neatly connects the dots between the aspects of settlement and volatility management by providing digital cash that can be used and enabling the ability to redeem the coin at a stable rate. While this may sound like a significant achievement, all JP Morgan’s stablecoin actually provides is the ability for a counterparty to be paid by JP Morgan in exchange for being provided a digital certificate. It is actually anathema to the idea of creating an ecosystem whereby all participants can utilize a universally accepted and redeemable digital cash. Instead, it is a mechanism where JP Morgan will redeem a token that it issues on its platform only. This is akin to only being able to buy, gamble and cash in your gambling chips at the Venetian casino. And far from being a technology innovation, this is something that at its most fundamental is old technology masquerading as a new innovation.


How Crypto Company is Fighting Setbacks to Deliver New Technology for Users


ILCoin says that things truly began to change in November 2017, putting the startup in a position where it could start to develop and build foundations for future success. The project says 2018 delivered much more change and positive developments than in the past three years combined, with its team establishing meaningful relationships with exchanges and listing sites. ILCoin says that its newly developed consensus mechanism, C2P, will help deliver levels of security on blockchain that have never been achieved by other projects. After learning harsh lessons during the early stages of its business, the company’s founders are determined to focus on creating sound technology that can make a difference to the public. ILCoin says this is a stark contrast to other companies which have aimed to promote ERC-20 tokens through exaggerated and often slick-looking marketing campaigns, even though the product has little substance.


A Great Engineer Needs the Liberal Arts

Every great developer I've worked with has excellent problem solving skills. I've participated in many technical interviews, on both sides of the table, where the goal wasn't to determine coding ability as much as it was to demonstrate how a person approaches a new problem. In STEM subjects, the scientific method is often employed as a logical set of analytical steps. ... It may be easy to forget that the process begins with asking a question and doing background research, and ends with communicating your findings. Coming up with a question, determining if it is the right question to ask, and doing background research, all require critical thinking skills which are the focus of the liberal arts. Effectively reporting your findings comes back to knowing your audience. If you wrote a simple prototype application to test performance improvement, how would you communicate the results to your non-technical product owner? Showing the raw code is probably no more helpful than writing a 50-page report.


How to Build an Enterprise Architecture Roadmap: 4 Styles to Consider

Of course, not everyone wants to go to Hawaii for sun and seafood. Perhaps snow and schnapps in the Swiss Alps is more your style? Similarly, each business has a different destination in mind and a different set of core metrics to guide them there. If what you primarily care about is using high-cost resources efficiently, you will need quite a different set of roadmapping priorities from an FMCG company, which is primarily focused on time-to-market. In ABACUS you can set any number of goals and then use the tool’s powerful analytics to quantitatively assess potential options. This includes out-of-the-box algorithms using equational, structural, discrete-event and Monte-Carlo techniques. You can run these using a range of metrics including Financial (e.g. TCO, ROI, NPV), Technical (e.g. resource utilization, response times, availability and reliability) and Environmental (e.g. carbon footprint, resource re-use, sustainability, heat & power consumption).
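To give a flavour of what a Monte-Carlo assessment of a financial metric such as NPV involves, here is a minimal sketch. The discount rate, cash flows and noise level are hypothetical, and this is not ABACUS's actual algorithm:

```kotlin
import kotlin.math.pow
import kotlin.random.Random

// Discounted cash flow: index 0 is "now", later indices are years out.
fun npv(rate: Double, cashFlows: List<Double>): Double =
    cashFlows.withIndex().sumOf { (t, cf) -> cf / (1 + rate).pow(t) }

// Monte-Carlo estimate: perturb each cash flow uniformly by up to
// +/- `noise` and average the resulting NPVs over many runs.
fun monteCarloNpv(rate: Double, base: List<Double>, noise: Double,
                  runs: Int, rng: Random): Double {
    var total = 0.0
    repeat(runs) {
        val sample = base.map { it * (1 + (rng.nextDouble() * 2 - 1) * noise) }
        total += npv(rate, sample)
    }
    return total / runs
}
```

Running many perturbed scenarios like this is how a roadmap option's metric can be quoted with uncertainty attached, rather than as a single point estimate.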


Catching Up On The Open Group Open Process Automation Forum

One key element that distinguishes process automation is that it is “always-on.” It’s a non-stop effort. Once the plant stops, the organization stops making money. It’s vitally crucial that the plant keeps operating, hopefully at optimal efficiency. The manufacturer is very much opposed to anything that will cause the plant to shut down because that will result in a direct loss of revenue to the organization. The same applies to other industries besides oil and gas. It applies in pharmaceutical companies as they go through the whole process of generating the products, packaging them and getting them out the door. That has its own challenges in that a lot is based on how quickly you can get to market. Food and beverage is another example. They do a lot of continuous processing where soda, beer, cereals and all kinds of other foodstuffs are created from raw materials on a continuous processing basis.



Quote for the day:


"Really great people make you feel that you, too, can become great." -- Mark Twain


Daily Tech Digest - March 09, 2019

Misconceptions about the term RPA: would removing a letter from the acronym help?

Removing the ‘robotic’ term may help to alleviate fears of robots taking over; but according to Jon Clark, proposition development at ActiveOps, it is the word ‘process’ which is the problem. “A process can be very wide-ranging and complex and the type of robots we are seeing automate ‘tasks’ within a ‘process’, so I think the ‘P’ in RPA is part of the problem, not the ‘R’. This is a subtle distinction but creates a challenge in terms of perception,” he says. The process of a credit card application for example, is made up of a series of steps such as checking details, credit scores, updating systems, sending confirmation emails and instructing the card printer. “That’s important because people tend to hear ‘process automation’ and think the whole thing will be automated. Unfortunately, it’s not that simple because robots aren’t yet able to do every task in the process,” he states. However, many within the industry believe that the RPA term should remain, and that changing any of the words could cause more problems than it solves.


Online voting: Now Estonia teaches the world a lesson in electronic elections

Voting online, or i-voting, as it is often called in Estonia, takes place during the advance voting period that runs from the 10th until the fourth day before the election. It is not possible to i-vote on election day. The voting process itself is fairly simple. The voter needs a computer with an internet connection and a national ID card or a mobile ID with valid certificates and PIN codes. Once the voting application is downloaded, the software automatically checks if the voter is eligible to cast a ballot and displays the list of candidates according to the region where the voter is registered. After voters make their decision, the application encrypts their vote and it is securely sent to the vote-collecting server. Every vote also receives a timestamp, so if necessary, it is possible to verify later whether the vote was forwarded to the collecting server. As i-voting doesn't take place in a controlled environment like a polling station, the authorities have to ensure that the vote has been freely cast. So, voters can change their choice during the advance voting period digitally or at a polling station, and then the last vote given is the one that counts.
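The "last vote counts" rule can be sketched as follows. This is a toy model: real i-voting relies on ID-card signatures and public-key encryption, while the encrypt/decrypt functions here are mere placeholders and all names are illustrative:

```kotlin
// Toy stand-ins for the real public-key encryption done on the voter's machine.
fun encrypt(choice: String): String = choice.reversed()
fun decrypt(ciphertext: String): String = ciphertext.reversed()

data class EncryptedVote(val voterId: String, val ciphertext: String, val timestamp: Long)

// Advance-voting rule: a voter may re-vote, and only the latest ballot
// (by timestamp) per voter is decrypted and counted.
fun tally(votes: List<EncryptedVote>): Map<String, Int> =
    votes.groupBy { it.voterId }
        .map { (_, ballots) -> ballots.maxByOrNull { it.timestamp }!! }
        .groupingBy { decrypt(it.ciphertext) }
        .eachCount()
```

Because only the newest ballot per voter survives the grouping step, a coerced voter can simply vote again later and silently override the earlier choice.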


Triton is the world’s most murderous malware, and it’s spreading


The malware made it possible to take over these systems remotely. Had the intruders disabled or tampered with them, and then used other software to make equipment at the plant malfunction, the consequences could have been catastrophic. Fortunately, a flaw in the code gave the hackers away before they could do any harm. It triggered a response from a safety system in June 2017, which brought the plant to a halt. Then in August, several more systems were tripped, causing another shutdown. The first outage was mistakenly attributed to a mechanical glitch; after the second, the plant's owners called in investigators. The sleuths found the malware, which has since been dubbed “Triton” (or sometimes “Trisis”) for the Triconex safety controller model that it targeted, which is made by Schneider Electric, a French company. In a worst-case scenario, the rogue code could have led to the release of toxic hydrogen sulfide gas or caused explosions, putting lives at risk both at the facility and in the surrounding area. Gutmanis recalls that dealing with the malware at the petrochemical plant, which had been restarted after the second incident, was a nerve-racking experience.


Blockchain marches steadily into global financial transaction networks

SWIFT is among a groundswell of financial services firms testing blockchain as a more efficient and transparent way of conducting cross-border financial transactions, unhampered by much of the regulatory oversight to which current networks must adhere. SWIFT may also be feeling pressure as more and more firms in financial services pilot, or outright adopt, DLT technology. "There is a lot of competition now," said Avivah Litan, Gartner vice president of research. "If you think about SWIFT, it was just a big banking network that moved money quickly and authenticated users, but it costs a lot to do that. And now there are competing initiatives using blockchain." Litan pointed to J.P. Morgan Chase, CLS Group and Ripple, a permissioned blockchain ledger that moves money using a proprietary cryptocurrency, as prime examples of those developing blockchain for cross-border financial transfers. "Ripple is a competitor in the sense that they are trying to set up a bank-to-bank network," Litan said.


GDPR: Still Plenty of Lessons to Learn

During the RSA panel, security expert Ariel Silverstone reported that as of the end of January, there were 41,000 breaches reported under GDPR that fell within the 72-hour notification window. Additionally, there have been about 250 investigations by the various data protection authorities. Silverstone noted that while GDPR involves all 28 countries of the EU, variations in how each country is implementing the law mean companies could face different penalties. For instance, he described that Germany's interpretation of the law makes a violation nearly a criminal case, while other nations have been reducing fines. Silverstone also pointed out that the California Consumer Privacy Act, which adheres to some of the same principles as GDPR, is offering some of the same consumer protections that Europeans now enjoy. Mark Weatherford, the global information security strategist at Booking Holdings, told the audience that while complying with the GDPR rules is difficult, it's not impossible. Before his current job, he worked at a startup that needed to come into compliance.



A Practical Intro to Kotlin Multiplatform

Kotlin has enjoyed an explosion in popularity ever since Google announced first-class support for the language on Android, and Spring Boot 2 offered Kotlin support. You’d be forgiven for thinking that Kotlin only runs on the JVM, but that’s no longer true. Kotlin Multiplatform is an experimental language feature that allows you to run Kotlin in JavaScript, iOS, and native desktop applications, to name but a few. And best of all, it’s possible to share code between all these targets, reducing the amount of time required for development. This blog post will explore the current state of Kotlin Multiplatform by building a simple app that runs on Android, iOS, Browser JS, Java Desktop, and Spring Boot. Maybe in a few years, Kotlin will be a popular choice on all these platforms as well. ... To share Kotlin code between platforms, we’ll create a common module that has a dependency on the Kotlin standard library. For each platform, we’ll create a separate module that depends on the common module and the appropriate Kotlin language dependency.
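The common/platform module layout described above might look roughly like this in a Gradle Kotlin DSL build file. This is a hedged sketch only: exact plugin, target and dependency names vary across Kotlin versions.

```kotlin
// Hypothetical build.gradle.kts for the shared project (Gradle Kotlin DSL).
plugins {
    kotlin("multiplatform")
}

kotlin {
    jvm()    // desktop and Spring Boot backend
    js()     // browser
    // iosX64(), etc., for the remaining targets

    sourceSets {
        val commonMain by getting {
            dependencies {
                implementation(kotlin("stdlib-common"))
            }
        }
        // jvmMain, jsMain, ... hold platform-specific code; anything the
        // common module declares with `expect` gets a per-target `actual`
        // implementation in these source sets.
    }
}
```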


How Daimler is using graph database technology in HR


For us, we could see advantages to using graph technology in HR projects because HR data is not isolated, so you don't normally have one person working without a connection to another person. If you look at a company, every time you look at the people working in the company you will see that they all have a connection to other people working in the company, you won't see anybody who is completely isolated. That is one of the reasons why we thought that HR data might be a very good fit with a graph data model. We have started with trying to understand what graph and HR data have in common. ... The second reason, and it's a concrete reason why we created this structured application, is that we created our Leadership 2020 programme at Daimler. We are transforming as a company from the classical, hierarchical structure to a mixture of classic hierarchies and what is called a 'swarm' which is a mixture of the same people working on the same project but coming from different departments and different hierarchies.
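The intuition that HR data fits a graph can be sketched with a tiny adjacency-list model: people are nodes, working relationships are edges, and a "swarm" is whatever one traversal reaches. The names and structure here are illustrative, not Daimler's actual schema or a graph database API:

```kotlin
// People are nodes, working relationships are undirected edges.
class PeopleGraph {
    private val edges = mutableMapOf<String, MutableSet<String>>()

    fun connect(a: String, b: String) {
        edges.getOrPut(a) { mutableSetOf() }.add(b)
        edges.getOrPut(b) { mutableSetOf() }.add(a)
    }

    // Breadth-first traversal: everyone reachable from `person`,
    // e.g. the cross-department "swarm" they belong to.
    fun reachable(person: String): Set<String> {
        val seen = mutableSetOf(person)
        val queue = ArrayDeque(listOf(person))
        while (queue.isNotEmpty()) {
            for (n in edges[queue.removeFirst()].orEmpty()) {
                if (seen.add(n)) queue.addLast(n)
            }
        }
        return seen - person
    }
}
```

A dedicated graph database answers this kind of "who is connected to whom, across how many hops" question natively, which is exactly the shape of the HR queries described above.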


Blockchain boosters warn that regulatory uncertainty is harming innovation

Businesses and consumers are reluctant to develop and use blockchain applications in the face of uncertainty over whether they might violate outdated financial laws, the Chamber of Digital Commerce argues in its “National Action Plan” (PDF). Among other things, it calls for “clearly articulated and binding statements from regulators regarding the application of law to blockchain-based applications and tokens.” On Wednesday at the DC Blockchain Summit, SEC commissioner Hester Peirce warned industry advocates to be careful what they wish for. Peirce called the action plan “helpful” and agreed that clear regulatory guidelines are needed. But she cautioned against expecting the government to try to foster innovation, which she said could do more harm than good. Peirce urged patience and cooperation. Regulators are slow, she said, and this technology is complicated: “There’s a learning curve. People at the SEC are trying to learn about this space, and trying to understand where the pressure points are.”


2 reasons a federated database isn’t such a slam-dunk

First, performance. You can certainly mix data from an object-based database, a relational database, and even unstructured data, using a centralized, virtualized, metadata-driven view. But your ability to run real-time queries on that data, in a reasonable amount of time, is another story. The dirty little secret about federated database systems (cloud or not) is that unless you're willing to spend the time it takes to optimize the use of the virtual database, performance issues are likely to pop up that make the federated database, well, useless. By the way, putting the federated database in the cloud won't help you, even if you add more virtual storage and compute to try to brute-force the performance. The reason is that so much has to happen in the background just to get the data in place from many different database sources. These issues are typically fixed by good federated database design, tuning the databases, and placing limits on how many physical databases can be involved in a single access pattern. I've found that the limit is typically four or five.
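The cost the author describes — gathering rows from several physical sources before a single query can be answered — can be seen in miniature with SQLite's `ATTACH`, used here as a stand-in for federation (an assumption for illustration; a real federated system spans heterogeneous engines, where the cross-source work is far more expensive). All table and column names are hypothetical.

```python
import sqlite3

# Two independent "source" databases exposed through one connection.
conn = sqlite3.connect(":memory:")
conn.execute("ATTACH DATABASE ':memory:' AS sales")

conn.execute("CREATE TABLE main.employees (id INTEGER, name TEXT)")
conn.execute("CREATE TABLE sales.orders (emp_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO main.employees VALUES (?, ?)",
                 [(1, "alice"), (2, "bob")])
conn.executemany("INSERT INTO sales.orders VALUES (?, ?)",
                 [(1, 100.0), (1, 50.0), (2, 75.0)])

# The cross-database join is where federated performance costs accumulate:
# rows must be fetched from every underlying source before they can be joined.
rows = conn.execute("""
    SELECT e.name, SUM(o.amount)
    FROM main.employees e JOIN sales.orders o ON e.id = o.emp_id
    GROUP BY e.name ORDER BY e.name
""").fetchall()
print(rows)  # [('alice', 150.0), ('bob', 75.0)]
```

Here both "databases" live in the same process, so the join is cheap; in a real federation each source sits behind its own network hop and query engine, which is why the author caps a single access pattern at four or five physical databases.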


How to use process data mining to improve DevOps

Process mining is the data-driven improvement of business processes, and data scientists often use it to suggest ways to enhance performance. Process data mining works for companies and DevOps teams with processes in place, as well as those that still need to create them. In the first case, people can compare the best practices for their process with what regularly happens within the team. But individuals at the enterprise level can also use process data mining to establish their processes. Information sources such as event logs give details about how and when people use tools. Process data mining shows people how far they are from an ideal target process, which also means it helps solidify the processes a DevOps team follows. Then it's possible to discover what is going wrong and identify the most meaningful process-related improvements. ... Process data mining allows for real-time data collection. The companies that successfully use DevOps rely on release cycle metrics that tell them about progress and quality levels.



Quote for the day:


"Strong convictions precede great actions." -- James Freeman Clarke