Daily Tech Digest - March 12, 2019

The 3 surprising secrets that drive innovation in the digital era

It's inevitable: hear the word innovation and you immediately start thinking about technology. After all, innovation and technology have been nearly synonymous for most of the last two decades. The inclination is even stronger if you're an IT professional, given our natural fondness for technology. But if you want to transform your organization into an innovation machine, the place to start is with the recognition that innovation is not, in fact, about technology at all. ... The way Ubels discussed what the company was doing was illuminating. “I love technology, but it’s about building better buildings for the world,” he explained during a subsequent conversation we had on the subject. “It’s healthy, sustainable, the best working environment for employees. There’s a war for talent and a building is an important part of how you express yourself as an organization and a building that people like to go to.” Here was the person responsible for the technology at a company that had made technology a central component of its value proposition — and there was almost no talk about technology, either from the keynote stage or during our conversation.



5 steps to performing an effective data security risk assessment

A threat is anything that has the potential to cause harm to the valuable data assets of a business. The threats companies face include natural disasters, power failure, system failure, accidental insider actions (such as accidental deletion of an important file), malicious insider actions (such as a rogue agent gaining membership to a privileged security group), and malicious outsider actions (such as phishing attacks, malware, spoofing, etc.). Each company should have its central risk team determine the most probable threats and plan accordingly. ... A vulnerability is a weakness or gap in a company's network, systems, applications, or even processes that can be exploited to negatively impact the business. Vulnerabilities can be physical in nature (such as old and outdated equipment), they can involve weak system configurations (such as leaving a system unpatched or not following the principle of least privilege), or they can result from awareness issues. As with determining threats, analyzing vulnerabilities is best completed by the central risk team.
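The threat and vulnerability findings above typically feed a risk matrix that scores likelihood against impact so the risk team can prioritize. A minimal sketch of that scoring step (the scales, threats and numbers here are invented for illustration; the article prescribes no specific formula):

```python
# Toy risk scoring: risk = likelihood x impact, each on a 1-5 scale.
# The threats and scores below are illustrative placeholders.

def risk_score(likelihood, impact):
    """Return a numeric risk score and a coarse rating band."""
    score = likelihood * impact
    if score >= 15:
        band = "high"
    elif score >= 8:
        band = "medium"
    else:
        band = "low"
    return score, band

threats = [
    ("phishing attack", 4, 4),
    ("accidental file deletion", 3, 2),
    ("natural disaster", 1, 5),
]

# Rank threats so the central risk team can plan for the worst first.
ranked = sorted(threats, key=lambda t: risk_score(t[1], t[2])[0], reverse=True)
for name, likelihood, impact in ranked:
    score, band = risk_score(likelihood, impact)
    print(f"{name}: {score} ({band})")
```

Real assessments weight these inputs with asset value and existing controls, but the likelihood-times-impact core is the same.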


The buzz at RSA 2019: Cloud security, network security, managed services and more

Remember a few years ago when we were all shocked by dual exhibition floors in Moscone North and South? Well, the RSA conference addressed this by making one contiguous show floor in and between both buildings. Why so many vendors? Because every individual technology in the security technology stack is in play, driven by things like machine learning algorithms, cloud-based resources, automation, managed services components, etc. All these vendors may be a boon to industry trade shows, but they are confusing the heck out of cybersecurity pros. Instead of buzzwords and hyperbole, successful vendors will invest in user education and thought leadership, offering guidance and support for customers and prospects. ... Large cybersecurity vendors are jumping on this trend with integrated cybersecurity technology platforms and moving toward enterprise license agreements and subscription-based pricing. Many of the vendors I met with are now tracking multi-product deals and incenting direct sales and distributors in this direction.


Applying Artificial Intelligence in the Agile World

There are a growing number of customer service software products that let you combine your existing knowledge-base support with chatbots to provide pre-canned and self-learning responses to customer queries. This is a great way to start experimenting with self-learning capabilities. Recommendation systems, as popularised by Netflix’s movie recommendation feature, have made significant advancements in recent years. These can be easily integrated into existing systems to add self-learning capabilities. For example, collaborative filtering systems can collect and analyze users' behavioral information in the form of their feedback, ratings, preferences, and feature usage. Based on this information, these systems exploit similarities amongst users to generate recommendations. The emergence of operational chatbots, as popularised by GitHub’s open-source project Hubot, is changing the traditional operations paradigm. Work that previously happened offline is now being brought into chat rooms using communication tools such as Slack.
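Collaborative filtering as described above can be sketched in a few lines: store per-user ratings, find the most similar other user, and suggest items that user liked. A toy user-based example (the data and the simple nearest-neighbour policy are illustrative, not any vendor's implementation):

```python
import math

# Minimal user-based collaborative filtering sketch.
# Ratings matrix: user -> {item: rating}. All data is illustrative.
ratings = {
    "alice": {"matrix": 5, "inception": 4, "up": 1},
    "bob":   {"matrix": 4, "inception": 5, "up": 2, "interstellar": 5},
    "carol": {"matrix": 1, "up": 5, "frozen": 4},
}

def cosine_sim(a, b):
    """Cosine similarity over the items two users have both rated."""
    common = set(a) & set(b)
    if not common:
        return 0.0
    dot = sum(a[i] * b[i] for i in common)
    na = math.sqrt(sum(a[i] ** 2 for i in common))
    nb = math.sqrt(sum(b[i] ** 2 for i in common))
    return dot / (na * nb)

def recommend(user, k=1):
    """Suggest up to k items the most similar other user rated highly
    but `user` has not rated yet."""
    others = [(cosine_sim(ratings[user], ratings[u]), u)
              for u in ratings if u != user]
    _, nearest = max(others)
    seen = set(ratings[user])
    candidates = [(r, item) for item, r in ratings[nearest].items()
                  if item not in seen]
    return [item for _, item in sorted(candidates, reverse=True)[:k]]

print(recommend("alice"))  # ['interstellar']
```

Production systems replace the exhaustive pairwise comparison with matrix factorization or approximate nearest-neighbour search, but the exploit-similarity idea is the same.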


Cloud monitoring, management tools come up short


Cost and complexity were the top reasons given for cloud-monitoring failures. Forty-five percent said cloud support required additional software licenses or network monitoring tool modules, which they didn’t want to pay for. Forty-four percent indicated that cloud support in their tools was too difficult to implement or use. They simply couldn’t get value out of the updated tools. “Due to complacency and limitations of the software itself, we had to get rid of [a tool],” one IT executive at a North American distributor of heavy, manufactured products told EMA. “It’s not worth the time and investment. We didn’t want to spend more money on a new version that was just a redux of an older version. I didn’t see any real progress in the product.” Furthermore, 35 percent said their vendors had done a poor job of adding cloud-monitoring support to their tools, with the functional updates failing to meet their needs. And 28 percent said their vendors had failed to even establish a roadmap for cloud monitoring.


2019’s Most Inquired Professional Services Marketplace Model

Be it the medical services, freelancing services, travel or hospitality services to name a few. In whichever specifics the services marketplace may be, it’s prime role is to connect the people with service providers. Thumbtack, TaskRabbit, Handy.com and many more service marketplaces are becoming routine names for people. It literally took a good ten years for the customers to warm up with the idea of services marketplaces. With experiencing a lot many varied economic models, the services marketplace industry has undergone several phases of evolution. On this note, it becomes vital for companies to have a killer business model to lead and survive in the competition marathon. A number of businesses have recognized the essential aspects that contribute to design a lucrative business model. This blog gives a firsthand look to these key elements of the professional services marketplace model that’s pretty perfect for a services marketplace.


Get started with natural language processing APIs in cloud


With the popularity of voice assistant technologies, natural language processing APIs and similar services have become one of the most in demand -- and better understood -- subdisciplines of AI. There are decades of research to support the field, and it's used in countless products to analyze speech and text for language and sentiment, improve the ability to search unstructured data and even parse intent from conversations as they happen. Natural language processing has only recently become affordable enough to productize for the general public. Today, it is so commonplace that the major cloud providers -- as well as a number of smaller players -- offer it as a service. Each vendor has its own feature set to process natural, human-readable text. Let's review some of the most prominent natural language processing APIs and cloud-based services, as well as ways developers can incorporate them into applications.
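These services generally share the same request/response shape: an application sends raw text and receives a structured sentiment result. A mock sketch of that shape (the word-list scoring is a trivial stand-in, not a real model, and the field names merely echo common cloud APIs; no vendor's actual client is shown):

```python
# Mock of the typical sentiment-analysis request/response shape offered by
# cloud NLP APIs. The scoring is a trivial word-list stand-in, NOT a real
# model -- it only shows how an application would consume such a result.

POSITIVE = {"great", "good", "love", "excellent"}
NEGATIVE = {"bad", "poor", "hate", "terrible"}

def analyze_sentiment(text):
    """Return a response dict shaped like common NLP-as-a-service results:
    a score in [-1, 1] plus a magnitude reflecting emotional weight."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    score = 0.0 if total == 0 else (pos - neg) / total
    return {"documentSentiment": {"score": score, "magnitude": float(total)}}

resp = analyze_sentiment("The service was great, I love it!")
print(resp["documentSentiment"]["score"])  # positive
```

Swapping the mock for a real provider's client changes only the call inside `analyze_sentiment`; the consuming code stays the same, which is what makes these APIs easy to trial.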


Why CISOs Need Partners for Security Success

More and more CISOs are buying into the strategy of involving members of the C-suite as well as other leaders in key projects, Pescatore said. For instance, CISOs at power plants and other large manufacturing facilities are working with COOs to show how business results are affected when systems are offline due to a ransomware attack or another type of cyberattack, clearly demonstrating why there's a need for better security to improve reliability and resilience in the face of an interruption. ... The security team may not understand the goals of the development team and may lack the skills to keep up with the rapid pace of application development, Pescatore explained. "So the slowdown is really two things," Pescatore told me after his presentation. "The first is not understanding how the business works. It's about saying no to everything when sometimes there's no risk that anyone will care about. The second is skills - the security team might not be up to the task of going as fast as the other side."


How to shop for CDN services

Content delivery networks are the transparent backbone of the Internet that bring users every piece of content to their PCs or mobile browsers – from news stories to shopping sites to live-streaming video. For more than a decade, a content delivery network’s primary mission has been to reduce latency by shortening the distance between a website’s visitor and its server. Today, however, the stakes are much higher. Skyrocketing streaming demands, growing consumer impatience, spikes in global live viewership, and shifting device preferences are all changing CDN services, according to a study by streaming platform Conviva. Its users’ overall viewing hours increased 89 percent in 2018, including a 165 percent jump in streaming TV viewership in the fourth quarter alone, according to the study. Live content drove much of the surge, including a 217 percent spike in U.S. news watching during November’s mid-term elections. At the same time, rising expectations about video streaming quality have viewers more impatient than ever.


How AWS, Azure and Google approach service mesh technology


Some users only want service mesh connectivity and load balancing for their microservices. Here, Microsoft users will want to consider Azure Service Fabric. It supports deployment on other public clouds, which makes it the top service mesh for multi-cloud. Also consider Google's Kubernetes Engine and Istio, particularly if you're a Kubernetes shop. Amazon's basic service mesh tools are great for AWS users, but less versatile in multi- and hybrid cloud deployments. The middle ground, where most users will probably find themselves, is a bit more difficult to read at this point. Microsoft and Google have signaled they'll support a fairly portable service mesh vision via Azure Service Fabric and Google's Kubernetes-Istio combination, respectively. Amazon's middle ground is still divided and somewhat primitive compared to its competitors, which likely means more upgrades are on the way. In the long run, service mesh, managed container services and even serverless are likely to converge into a single uniform resource model for applications.



Quote for the day:


"Perhaps the ultimate test of a leader is not what you are able to do in the here and now - but instead what continues to grow long after you're gone" -- Tom Rath


Daily Tech Digest - March 11, 2019

A primer for CIOs needing ‘deep learning’ on the benefits of emerging tech

Davenport suggests that CIOs and business leaders need to look at AI through the lens of business capabilities rather than the lens of technology. He says that process automation focuses on the automation of digital and physical tasks using RPA. More importantly, Davenport says RPA is the least expensive and easiest to implement. It is an efficiency and time-saving play (coupled with consistency and standardization) and an integral part of any digital process. Davenport contrasts RPA with “cognitive insight” and “cognitive engagement”. Cognitive insight uses algorithms to detect patterns in data sets of vast volume and, with this, interpret their meaning. Insights here are typically provided by machine learning. These supervised or unsupervised approaches are data-intensive and detailed. Models are typically trained on a portion of a data set, and they get better (that is, they make better predictions) as they ingest new data. Clearly, the more data the better, especially for tasks like facial recognition. Cognitive insight often uses a version of machine learning called deep learning, which attempts to mimic the human brain to recognize patterns.


Cybercrime is increasing and more costly for organizations

Cyber attacks are evolving from the perspective of what they target, how they affect organizations, and the changing methods of attack, according to the study, which is based on interviews with 2,647 senior leaders from 355 companies across 11 countries and 16 industries. Information theft is the most expensive and fastest rising consequence of cyber crime. However, data is not the only target. Core systems such as industrial controls are being hacked in a dangerous trend to disrupt and destroy, the report said. While data remains a key target, theft is not always the outcome of an attack. A new wave of cyber attacks sees data no longer just being copied but being destroyed or changed, in attempts to breed distrust. Attacking data integrity is the next frontier of cyber threats, the report said. Cyber criminals such as hackers are adapting their attack methods. They are aiming at the human layer, which the researchers said is the weakest link in cyber defense, through increased ransomware and phishing and social engineering attacks as a path to entry.


Gen Z: The Challenges and Opportunities with New Talent from a New Generation


Unlimited access to technology and the internet has led Gen Zers into a mindset of hyper customization. Young people today are much less willing to follow a straightforward career path. Gen Zers have seen that only about 27 percent of college graduates are currently working in the field of their major, and this has led them to want to take a more proactive role in deciding and designing their career path. In fact, the Society for Human Resource Management finds that 56 percent of all Gen Zers say they want to write their own job description. While the dream for Gen Zers might be running their own company as a route to financial security and wellbeing, they will also covet opportunities to innovate and create value for the companies they work for. Lastly, as fully digital natives, the Gen Z workforce will obviously search for companies that offer the most advanced technological developments within their respective workplaces.


How Artificial Intelligence Could Transform Medicine

Dr. Topol believes that A.I. can do more than enhance diagnoses and treatments. It can also save doctors from doing tasks like taking notes and reading scans, allowing them to spend more time connecting with their patients. Recently, we caught up with Dr. Topol to discuss his thoughts on where A.I. has the most potential to improve health care, where it might stumble, and how it could protect doctors from things like burnout and depression. Here are edited excerpts from our interview. ... For the first time we’ve got real-time, objective metrics for state of mind and mood like tone of speech, breathing pattern, smartphone keystrokes and communication, and physical activity. And we’ve learned people would rather share their innermost secrets with an avatar compared with a human being. So, the landscape is ripe for A.I. to help alleviate the profound shortage of health professionals compared with the enormous burden of depression and other mental health conditions.


Deep Learning: When Should You Use It?


True, it is getting easier to use deep learning. Part of this is due to the ubiquity of open source platforms like TensorFlow and PyTorch. Then there is the emergence of cloud-based AI systems, such as Google’s AutoML. But such things only go so far. “Each neural network model has tens or hundreds of hyperparameters, so tuning and optimizing these parameters requires deep knowledge and experience from human experts,” said Jisheng Wang, who is the head of data science at Mist. “Interpretability is also a big challenge when using deep learning models, especially for enterprise software, which prefers to keep humans in the loop. While deep learning reduces the human effort of feature engineering, it also increases the difficulty for humans to understand and interpret the model. So in certain applications where we require human interaction and feedback for continuous improvement, deep learning may not be the appropriate choice.” However, there are alternatives that may not be as complex, such as traditional machine learning.
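Wang's point about tens or hundreds of hyperparameters is why tuning is usually automated rather than done by hand. A minimal grid-search sketch, with a toy validation function standing in for actually training and evaluating a network:

```python
import itertools

# Toy hyperparameter grid search. validation_score stands in for training a
# model and measuring it on held-out data; a real search plugs in an actual
# training loop (or delegates to a service like the AutoML mentioned above).

def validation_score(learning_rate, num_layers):
    """Pretend score that peaks at learning_rate=0.01, num_layers=3."""
    return -abs(learning_rate - 0.01) * 100 - abs(num_layers - 3)

grid = {
    "learning_rate": [0.1, 0.01, 0.001],
    "num_layers": [2, 3, 4],
}

best_params, best_score = None, float("-inf")
for values in itertools.product(*grid.values()):
    params = dict(zip(grid.keys(), values))
    score = validation_score(**params)
    if score > best_score:
        best_params, best_score = params, score

print(best_params)  # {'learning_rate': 0.01, 'num_layers': 3}
```

Even this tiny grid costs nine training runs; with hundreds of hyperparameters the combinatorics are why practitioners reach for random search, Bayesian optimization, or managed services instead.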


Humans Wanted: Robots Need You

Manufacturing and production anticipate the most change: 25% of employers say they will employ more people in the near-term while another 20% say they will employ fewer – resulting in job growth together with significant skills disruption in the industry. Growth will come too in frontline and customer-facing, engineering, and management roles, all of which require human skills such as advanced communication, negotiation, leadership, management and adaptability. In other functions, administrative and office roles are shrinking and overall HR headcount is expected to stay the same. ... Demand for tech and digital skills is growing across all functions, yet employers place increasing value on human skills as automation scales and machines prove better at routine tasks. While 38% of organizations say it is difficult to train in-demand technical skills, 43% said it is even harder to teach the soft skills they need, such as analytical thinking and communication.


Citrix breach once again highlights password weaknesses


In a statement, Citrix said it has taken action to contain the breach, begun a forensic investigation and engaged a cyber security firm to assist. The software firm said it has also taken actions to secure its internal network. “While our investigation is ongoing, based on what we know to date, it appears that the hackers may have accessed and downloaded business documents. The specific documents that may have been accessed, however, are currently unknown. At this time, there is no indication that the security of any Citrix product or service was compromised,” the statement said. The notification, however, has sounded alarm bells for governments and military organisations, as well as the more than 400,000 organisations around the world that use Citrix products and services, raising fears that their networks may be at risk of compromise. According to security firm Resecurity, Citrix was breached by the Iranian-linked group known as Iridium, which has hit more than 200 government agencies, oil and gas companies and technology companies.


Price transparency in healthcare requires accuracy via better use of technology

It’s the consumer-driven health plans, where patients are now responsible for more. They have to make a decision – “Do I buy my groceries, or do I have an MRI?” The shift in healthcare makes us go after the patient before insurance is paid 100 percent. Patients now have a lot of skin in the game. And they have to start thinking, “Do I really need this procedure, or can it wait?” ... It's actually a tremendous opportunity for technology to help patients and providers. We live in an experience economy, and in that economy everyone is used to having full transparency. We’re willing to pay for faster service, faster delivery. We have highly personalized experiences. And all of that should be the same in our healthcare experiences. This is what people have come to expect. And that's why, for us, it’s so important to provide personalized, consumer-friendly digital payment options.


What to Look for in an AI Partner

Focus is not always enough. Does your potential partner have the expertise to actually solve your particular problem? Expertise is a complicated issue. Partners need a certain level of domain knowledge. The team assigned to your organization must possess an understanding of your unique pain points and overall business. It doesn’t have to be exhaustive, but every industry is unique in some way, whether it’s in terms of regulations or customer profiles or something else, and if your team is not familiar, it can lead to big problems later. At the same time, deep data science experience is also essential. The models are the foundation of every AI solution. They must be carefully constructed, and now for the super challenging part: They need to be packaged in consumer-grade software and delivered through services that can drive operational impact in a manner applicable to your domain. And expertise does not stop there. Your chosen partner needs to be able to map out a clear path to implementation.


Is Blockchain an Enabler of Data Monetization?

A shared, distributed ledger (blockchain) has the following Big Data ramifications:
- Common access to the same data for all parties involved in a transaction. This should accelerate data acquisition, sharing, data quality, data governance and, ultimately, data analytics.
- A detailed register of all transactions or engagements kept in a single “file” or blockchain. This provides a complete view of the entire transaction, from engagement start to engagement finish, with no need to integrate pieces of data from multiple systems to create a single view of the engagement or transaction history.
- The ability to manage and control one’s own personal data without a third-party intermediary or centralized repository.
In short, blockchain has the potential to truly democratize the sharing and monetization of data and analytics by removing the middleman from those transactions.
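The single-register idea can be illustrated with a minimal hash-linked ledger: each entry commits to the previous one, so every party holding the chain sees the same end-to-end history and any tampering is detectable. A toy sketch (no networking, consensus or signatures; illustration only):

```python
import hashlib
import json

# Toy append-only ledger: each block's hash covers its payload plus the
# previous block's hash, giving one tamper-evident view of the whole
# engagement history, from start to finish.

def block_hash(prev_hash, payload):
    data = json.dumps({"prev": prev_hash, "payload": payload}, sort_keys=True)
    return hashlib.sha256(data.encode()).hexdigest()

def append(chain, payload):
    prev = chain[-1]["hash"] if chain else "0" * 64
    chain.append({"payload": payload, "hash": block_hash(prev, payload)})

def verify(chain):
    """True if every block still commits to its predecessor."""
    prev = "0" * 64
    for block in chain:
        if block["hash"] != block_hash(prev, block["payload"]):
            return False
        prev = block["hash"]
    return True

ledger = []
append(ledger, {"step": "engagement start", "amount": 100})
append(ledger, {"step": "engagement finish", "amount": 100})
print(verify(ledger))  # True

ledger[0]["payload"]["amount"] = 999  # tampering breaks the chain
print(verify(ledger))  # False
```

A real blockchain adds distributed replication and a consensus rule for who may append, but the analytics benefit described above comes from exactly this property: one complete, verifiable transaction history in one place.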



Quote for the day:


"If you are truly a leader, you will help others to not just see themselves as they are, but also what they can become." -- David P. Schloss


Daily Tech Digest - March 10, 2019

Hacking Our Identity: The Emerging Threats from Biometric Technology

Despite the seemingly enormous potential of biometric technology and its applications, the security it provides seems to be just an illusion due to the complex process, policy and people challenges it brings with it. While it is almost impossible to lose or replace biometrics, the question remains whether biometric technology is foolproof and ready for global implementation. That brings us to an important question: can the evolving biometric system be in itself a complete human identification and authentication system, or can it only be part of an identification system? ... Perhaps the biometric system can only be one part of an overall human identification or authentication process, as there are many other variables and parts of that process that will need to play an equal role in determining identity verification effectiveness. Moreover, since the evolving biometric technologies are vulnerable to errors and are easily tricked and manipulated (by AI), it is important that we evaluate whether the ongoing effort towards human identity authentication gives the decision-makers the level of security they are hoping for.



Why Our Brains Fall for False Expertise, and How to Stop It

The brain uses shortcuts to manage the vast amounts of information that it processes every minute in any given social situation. These shortcuts allow our nonconscious brain to deal with sorting the large volume of data while freeing up capacity in our conscious brain for dealing with whatever cognitive decision making is at hand. This process serves us well in many circumstances, such as having the reflex to, say, duck when someone throws a bottle at our head. But it can be harmful in other circumstances, such as when shortcuts lead us to fall for false expertise. At a cognitive level, the biases that lead us to believe false expertise are similarity (“People like me are better than people who aren’t like me”); experience (“My perceptions of the world must be accurate”); and expedience (“If it feels right, it must be true”). These shortcuts cause us to evaluate people on the basis of proxies — things such as height, extroversion, gender, and other characteristics that don’t matter, rather than more meaningful ones.


How to create a transformational cybersecurity strategy: 3 paths


"There's a fine line between the deeply technical, scientific part of cybersecurity, and the people part, which we spend less time talking about—the stuff that actually enables a sustainable transformation," Budge said. "We've seen how one without the other can fail." A good strategy moves security from an IT issue to one of customer trust, Budge said. It also moves security from a technically-focused discipline to a holistic one, and gives business the freedom to achieve its digital aspirations, rather than acting as a blocking agent, she added. Bad cybersecurity strategies are those that cause companies to miss the breaches they experience, that invest in the wrong areas, that require teams to spend their time responding tactically, and that struggle to attract and retain talent, Budge said. No one silver bullet exists for creating a cybersecurity strategy; each is dependent upon the size of the organization, its cybersecurity maturity, and the level of support in the organization, Budge said.


People Are More Complex Than Computers

Defining what Human Resources looks like and how this functions in a decentralised organisation with more than half of its staff being independent consultants is the first issue. Another realisation that came pretty quickly is that software metaphors can only take you so far. People have feelings, whereas computers don't. In real life, you are always testing in production; there is no "staging environment". In software, when you make a mistake you can try again many times, or write an automated test to make sure that the same issue won't happen again. In real life, this is impossible. Balancing freedom versus accountability is hard, as is diversity and inclusion when growing a global, distributed organisation. In short, growth of an organisation is neither linear nor predictable. What Equal Experts has learned from this process is that bigger is different, and many times you need to dynamically adapt, or "make it up as you go". As long as you strive for continuous improvement, and trust and empower your people, you are setting yourself up for success.


These are the keys to recruiting (and keeping) your Gen Z employees


“Gen Z is 100% digitally native, meaning they are the first job seekers to be born during the age of smartphones, self-service online tools and AI-enabled virtual assistants like Siri and Alexa. They’ve never known a world without the convenience and speed of digital interaction. Much of their time is spent on social media, streaming videos and gaming online,” says Kurt Heikkinen, CEO of candidate engagement and interview software Montage. “As a result, because so much of their world is instant, digital, and seamless, they expect the exact same experience when it comes to job searches and the hiring process. To create the kind of candidate experience that will engage Gen Z and accelerate job offers, explore interviewing technology that gives candidates more choice and control–like automated scheduling, AI-enabled virtual hiring assistants, and on-demand interviews–that offers candidates the high-touch, high-tech experience that they want during their job hunts.”


JP Morgan’s Stablecoin: A Feat of Engineering or Marketing?


JP Morgan’s stablecoin neatly connects the dots between the aspects of settlement and volatility management by providing digital cash that can be used and the ability to redeem the coin at a stable rate. While this may sound like a significant achievement, all JP Morgan’s stablecoin actually provides is the ability for a counterparty to be paid by JP Morgan in exchange for being provided a digital certificate. It is actually anathema to the idea of creating an ecosystem whereby all participants can utilize a universally accepted and redeemable digital cash. Instead, it is a mechanism whereby JP Morgan will redeem a token that it issues on its platform only. This is akin to only being able to buy, gamble and cash in your gambling chips at the Venetian casino. And far from being a technology innovation, this is something that at its most fundamental is old technology masquerading as a new innovation.


How a Crypto Company Is Fighting Setbacks to Deliver New Technology for Users


ILCoin says that things truly began to change in November 2017, putting the startup in a position where it could start to develop and build foundations for future success. The project says 2018 delivered much more change and positive developments than in the past three years combined, with its team establishing meaningful relationships with exchanges and listing sites. ILCoin says that its newly developed consensus mechanism, C2P, will help deliver levels of security on blockchain that have never been achieved by other projects. After learning harsh lessons during the early stages of its business, the company’s founders are determined to focus on creating sound technology that can make a difference to the public. ILCoin says this is a stark contrast to other companies which have aimed to promote ERC-20 tokens through exaggerated and often slick-looking marketing campaigns, even though the product has little substance.


A Great Engineer Needs the Liberal Arts

Every great developer I've worked with has excellent problem solving skills. I've participated in many technical interviews, on both sides of the table, where the goal wasn't to determine coding ability as much as it was to demonstrate how a person approaches a new problem. In STEM subjects, the scientific method is often employed as a logical set of analytical steps. ... It may be easy to forget that the process begins with asking a question and doing background research, and ends with communicating your findings. Coming up with a question, determining if it is the right question to ask, and doing background research, all require critical thinking skills which are the focus of the liberal arts. Effectively reporting your findings comes back to knowing your audience. If you wrote a simple prototype application to test performance improvement, how would you communicate the results to your non-technical product owner? Showing the raw code is probably no more helpful than writing a 50-page report.


How to Build an Enterprise Architecture Roadmap: 4 Styles to Consider

Of course, not everyone wants to go to Hawaii for sun and seafood. Perhaps snow and schnapps in the Swiss Alps is more your style? Similarly, each business has a different destination in mind and a different set of core metrics to guide them there. If what you primarily care about is using high-cost resources efficiently, you will need quite a different set of roadmapping priorities than an FMCG company, which is primarily focused on time-to-market. In ABACUS you can set any number of goals and then use the tool’s powerful analytics to quantitatively assess potential options. This includes out-of-the-box algorithms using equational, structural, discrete-event and Monte-Carlo techniques. You can run these using a range of metrics, including financial (e.g. TCO, ROI, NPV), technical (e.g. resource utilization, response times, availability, reliability) and environmental (e.g. carbon footprint, resource re-use, sustainability, heat and power consumption).
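The Monte-Carlo technique mentioned can be sketched simply: sample uncertain cost inputs many times and examine the distribution of outcomes rather than a single point estimate. A minimal TCO example (the cost figures, ranges and uniform distributions are invented for illustration; they are not ABACUS's algorithms):

```python
import random
import statistics

# Toy Monte-Carlo estimate of total cost of ownership (TCO).
# Each input cost is uncertain, so we sample it from a range and report
# the spread of outcomes instead of one point estimate.

random.seed(42)  # fixed seed so the sketch is reproducible

def sample_tco():
    hardware = random.uniform(80_000, 120_000)
    licenses = random.uniform(30_000, 60_000)
    support_per_year = random.uniform(10_000, 20_000)
    years = 5
    return hardware + licenses + support_per_year * years

runs = [sample_tco() for _ in range(10_000)]
print(f"mean TCO: {statistics.mean(runs):,.0f}")
print(f"p90  TCO: {sorted(runs)[int(0.9 * len(runs))]:,.0f}")
```

The payoff over a spreadsheet estimate is the percentile view: a roadmap option can be judged on its 90th-percentile cost, not just its average.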


Catching Up On The Open Group Open Process Automation Forum

One key element that distinguishes process automation is that it is “always-on”: it is a non-stop effort. Once the plant stops, the organization stops making money. It is vital that the plant keeps operating, ideally at optimal efficiency. The manufacturer is very much opposed to anything that will cause the plant to shut down, because that results in a direct loss of revenue to the organization. The same applies to industries besides oil and gas. It applies in pharmaceutical companies as they go through the whole process of generating products, packaging them and getting them out the door. That has its own challenges, in that a lot is based on how quickly you can get to market. Food and beverage is another example. They do a lot of continuous processing, where soda, beer, cereals and all kinds of other foodstuffs are created from raw materials on a continuous basis.



Quote for the day:


"Really great people make you feel that you, too, can become great." -- Mark Twain


Daily Tech Digest - March 09, 2019

Misconceptions about the term RPA: would removing a letter from the acronym help?

Removing the ‘robotic’ term may help to alleviate fears of robots taking over; but according to Jon Clark, proposition development at ActiveOps, it is the word ‘process’ which is the problem. “A process can be very wide-ranging and complex and the type of robots we are seeing automate ‘tasks’ within a ‘process’, so I think the ‘P’ in RPA is part of the problem, not the ‘R’. This is a subtle distinction but creates a challenge in terms of perception,” he says. The process of a credit card application, for example, is made up of a series of steps such as checking details, credit scores, updating systems, sending confirmation emails and instructing the card printer. “That’s important because people tend to hear ‘process automation’ and think the whole thing will be automated. Unfortunately, it’s not that simple because robots aren’t yet able to do every task in the process,” he states. However, many within the industry believe that the RPA term should remain, and that changing any of the words could cause more problems than it solves.


Online voting: Now Estonia teaches the world a lesson in electronic elections

Voting online, or i-voting, as it is often called in Estonia, takes place during the advance voting period that runs from the 10th until the fourth day before the election. It is not possible to i-vote on election day. The voting process itself is fairly simple. The voter needs a computer with an internet connection and a national ID card or a mobile ID with valid certificates and PIN codes. Once the voting application is downloaded, the software automatically checks if the voter is eligible to cast a ballot and displays the list of candidates according to the region where the voter is registered. After voters make their decision, the application encrypts their vote and it is securely sent to the vote-collecting server. Every vote also receives a timestamp, so if necessary, it is possible to verify later whether the vote was forwarded to the collecting server. As i-voting doesn't take place in a controlled environment like a polling station, the authorities have to ensure that the vote has been freely cast. So, voters can change their choice during the advance voting period digitally or at a polling station, and then the last vote given is the one that counts.
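The "last vote counts" rule is easy to picture as code. This is a toy sketch, not Estonia's actual system; the voter IDs, candidates and timestamps are invented for illustration:

```python
from datetime import datetime

def count_last_votes(ballots):
    """Keep only each voter's most recent ballot, then tally by candidate.

    ballots: iterable of (voter_id, candidate, timestamp) tuples.
    """
    latest = {}
    for voter_id, candidate, ts in ballots:
        # A newer timestamp for the same voter overrides any earlier ballot.
        if voter_id not in latest or ts > latest[voter_id][1]:
            latest[voter_id] = (candidate, ts)
    tally = {}
    for candidate, _ in latest.values():
        tally[candidate] = tally.get(candidate, 0) + 1
    return tally

ballots = [
    ("v1", "A", datetime(2019, 2, 21, 10, 0)),
    ("v2", "B", datetime(2019, 2, 21, 11, 0)),
    ("v1", "B", datetime(2019, 2, 22, 9, 0)),  # v1 re-votes; this one counts
]
print(count_last_votes(ballots))  # {'B': 2}
```

The override logic is the anti-coercion measure the article describes: because any earlier ballot can be silently replaced, a coerced vote has no lasting effect.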


Triton is the world’s most murderous malware, and it’s spreading


The malware made it possible to take over these systems remotely. Had the intruders disabled or tampered with them, and then used other software to make equipment at the plant malfunction, the consequences could have been catastrophic. Fortunately, a flaw in the code gave the hackers away before they could do any harm. It triggered a response from a safety system in June 2017, which brought the plant to a halt. Then in August, several more systems were tripped, causing another shutdown. The first outage was mistakenly attributed to a mechanical glitch; after the second, the plant's owners called in investigators. The sleuths found the malware, which has since been dubbed “Triton” (or sometimes “Trisis”) for the Triconex safety controller model that it targeted, which is made by Schneider Electric, a French company. In a worst-case scenario, the rogue code could have led to the release of toxic hydrogen sulfide gas or caused explosions, putting lives at risk both at the facility and in the surrounding area. Gutmanis recalls that dealing with the malware at the petrochemical plant, which had been restarted after the second incident, was a nerve-racking experience.


Blockchain marches steadily into global financial transaction networks

SWIFT is among a groundswell of financial services firms testing blockchain as a more efficient and transparent way of conducting cross-border financial transactions, unhampered by much of the regulatory oversight to which current networks must adhere. SWIFT may also be feeling pressure as more and more firms in financial services pilot, or outright adopt, DLT technology. "There is a lot of competition now," said Avivah Litan, Gartner vice president of research. "If you think about SWIFT, it was just a big banking network that moved money quickly and authenticated users, but it costs a lot to do that. And now there are competing initiatives using blockchain." Litan pointed to J.P. Morgan Chase, CLS Group and Ripple, a permissioned blockchain ledger that moves money using a proprietary cryptocurrency, as prime examples of those developing blockchain for cross-border financial transfers. "Ripple is a competitor in the sense that they are trying to set up a bank-to-bank network," Litan said.


GDPR: Still Plenty of Lessons to Learn

During the RSA panel, security expert Ariel Silverstone reported that as of the end of January, there were 41,000 breaches reported under GDPR that fell within the 72-hour notification window. Additionally, there have been about 250 investigations by the various data protection authorities. Silverstone noted that while GDPR involves all 28 countries of the EU, variations in how each country is implementing the law mean companies could face different penalties. For instance, he described that Germany's interpretation of the law makes a violation nearly a criminal case, while other nations have been reducing fines. Silverstone also pointed out that the California Consumer Privacy Act, which adheres to some of the same principles as GDPR, is offering some of the same consumer protections that Europeans now enjoy. Mark Weatherford, the global information security strategist at Booking Holdings, told the audience that while complying with the GDPR rules is difficult, it's not impossible. Before his current job, he worked at a startup that needed to come into compliance.



A Practical Intro to Kotlin Multiplatform

Kotlin has enjoyed an explosion in popularity ever since Google announced first-class support for the language on Android, and Spring Boot 2 offered Kotlin support. You’d be forgiven for thinking that Kotlin only runs on the JVM, but that’s no longer true. Kotlin Multiplatform is an experimental language feature that allows you to run Kotlin in JavaScript, iOS, and native desktop applications, to name but a few. And best of all, it’s possible to share code between all these targets, reducing the amount of time required for development. This blog post will explore the current state of Kotlin Multiplatform by building a simple app that runs on Android, iOS, Browser JS, Java Desktop, and Spring Boot. Maybe in a few years, Kotlin will be a popular choice on all these platforms as well. ... To share Kotlin code between platforms, we’ll create a common module that has a dependency on the Kotlin standard library. For each platform we support, we’ll need to create a separate module that depends on the common module and the appropriate Kotlin language dependency.


How Daimler is using graph database technology in HR


For us, we could see advantages to using graph technology in HR projects because HR data is not isolated, so you don't normally have one person working without a connection to another person. If you look at a company, every time you look at the people working in the company you will see that they all have a connection to other people working in the company; you won't see anybody who is completely isolated. That is one of the reasons why we thought that HR data might be a very good fit with a graph data model. We started by trying to understand what graph and HR data have in common. ... The second reason, and it's a concrete reason why we created this structured application, is that we created our Leadership 2020 programme at Daimler. We are transforming as a company from the classical, hierarchical structure to a mixture of classic hierarchies and what is called a 'swarm': people working on the same project but coming from different departments and different hierarchies.


Blockchain boosters warn that regulatory uncertainty is harming innovation

Businesses and consumers are reluctant to develop and use blockchain applications in the face of uncertainty over whether they might violate outdated financial laws, the Chamber of Digital Commerce argues in its “National Action Plan” (PDF). Among other things, it calls for “clearly articulated and binding statements from regulators regarding the application of law to blockchain-based applications and tokens.” On Wednesday at the DC Blockchain Summit, SEC commissioner Hester Peirce warned industry advocates to be careful what they wish for. Peirce called the action plan “helpful” and agreed that clear regulatory guidelines are needed. But she cautioned against expecting the government to try to foster innovation, which she said could do more harm than good. Peirce urged patience and cooperation. Regulators are slow, she said, and this technology is complicated: “There’s a learning curve. People at the SEC are trying to learn about this space, and trying to understand where the pressure points are.”


2 reasons a federated database isn’t such a slam-dunk

First, performance. You can certainly mix data from an object-based database, a relational database, and even unstructured data, using a centralized, virtualized, metadata-driven view. But your ability to run real-time queries on that data, in a reasonable amount of time, is another story. The dirty little secret about federated database systems (cloud or not) is that unless you’re willing to spend the time it takes to optimize the use of the virtual database, performance issues are likely to pop up that make the use of a federated database, well, useless. By the way, putting the federated database in the cloud won’t help you, even if you add more virtual storage and compute to try to brute-force the performance. The reason is that so much has to happen in the background just to get the data in place from many different database sources. These issues are typically fixed through good federated database design, tuning the database, and placing limits on how many physical databases can be involved in a single pattern of access. I’ve found that the limit is typically four or five.
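To make the "many physical databases behind one virtual view" idea concrete, here is a minimal sketch using SQLite's ATTACH to join across two separate databases in a single query. A real federated system does this across different engines and over the network, which is exactly where the performance cost the article warns about comes from; the tables and data below are invented:

```python
import sqlite3

# Two separate SQLite databases stand in for independent physical sources.
sales = sqlite3.connect(":memory:")
sales.execute("CREATE TABLE orders (id INTEGER, customer TEXT, total REAL)")
sales.execute("INSERT INTO orders VALUES (1, 'acme', 120.0), (2, 'globex', 80.0)")

# ATTACH makes a second database queryable through the same connection,
# much as a federated layer exposes many sources through one virtual view.
sales.execute("ATTACH DATABASE ':memory:' AS crm")
sales.execute("CREATE TABLE crm.customers (name TEXT, region TEXT)")
sales.execute("INSERT INTO crm.customers VALUES ('acme', 'EU'), ('globex', 'US')")

# One query spanning both "databases".
rows = sales.execute(
    """SELECT c.region, SUM(o.total)
       FROM orders o JOIN crm.customers c ON o.customer = c.name
       GROUP BY c.region ORDER BY c.region"""
).fetchall()
print(rows)  # [('EU', 120.0), ('US', 80.0)]
```

Here both sources live in the same process, so the join is cheap; spread them across engines and networks and the background data movement dominates, which is why limiting the number of participating databases matters.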


How to use process data mining to improve DevOps

Process mining is the data-driven improvement of business processes, and data scientists often use it to suggest ways to enhance performance. Process data mining works for companies and DevOps teams with processes in place, as well as those that still need to create processes. In the first case, people can compare the best practices for their process with what regularly happens within the team. But, individuals at the enterprise level can also use process data mining to establish their processes. Information sources such as event logs give details about how and when people use tools. Process data mining shows people how far away they are from the target of an ideal process, which can also mean it helps people solidify the processes a DevOps team follows. Then, it’s possible to know how to make the most meaningful process-related improvements and discover the things going wrong. ... Process data mining allows for real-time data collection. The companies that successfully use DevOps rely on release cycle metrics that tell them about progress and quality levels.
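The event-log analysis described above reduces to a simple pattern: group events by case, then measure the time between consecutive activities. A minimal sketch, with an invented DevOps-flavoured event log:

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical event log: (case id, activity, timestamp).
log = [
    ("build-1", "commit", datetime(2019, 3, 9, 9, 0)),
    ("build-1", "test",   datetime(2019, 3, 9, 9, 20)),
    ("build-1", "deploy", datetime(2019, 3, 9, 10, 0)),
    ("build-2", "commit", datetime(2019, 3, 9, 11, 0)),
    ("build-2", "test",   datetime(2019, 3, 9, 11, 5)),
    ("build-2", "deploy", datetime(2019, 3, 9, 11, 30)),
]

def mean_step_minutes(log):
    """Average minutes spent on each activity-to-activity transition."""
    cases = defaultdict(list)
    for case, activity, ts in sorted(log, key=lambda e: e[2]):
        cases[case].append((activity, ts))
    durations = defaultdict(list)
    for events in cases.values():
        # Pair each event with its successor within the same case.
        for (a, t1), (b, t2) in zip(events, events[1:]):
            durations[(a, b)].append((t2 - t1).total_seconds() / 60)
    return {step: sum(v) / len(v) for step, v in durations.items()}

print(mean_step_minutes(log))
```

Comparing the measured transition times against the team's target process is what reveals where the real workflow drifts from the ideal one.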



Quote for the day:


"Strong convictions precede great actions." -- James Freeman Clarke


Daily Tech Digest - March 08, 2019

What we see is a bigger and bigger push to not just protect data, but demands to protect identity. That was always an expected quantity inside of companies on our own infrastructures in our own data centers, because we needed to protect data and assets of value. But now this is being extended to an expectation for customers that we are doing business with to have securitized access. And this is such a big leap. It's only come into being in the last couple of years with regulations around data protection and privacy, that we need to once again make sure that that customer is who they say they are, in order to be able to ensure the privacy of that data. And this is causing a tremendous disruption in the marketplace; if not from a solution standpoint, it is definitely causing a disruption relative to thinking about architecture, thinking about how security is designed. We were originally designed to protect assets and we have firewalls and perimeters.


What is Big Data and why does it matter for business?

The misuse and mishandling of personal data is currently a hot topic, thanks in large part to the scandal involving Facebook and Cambridge Analytica. Increased regulation around the storage and processing of data is highly likely – indeed, it is already underway in Europe in the form of the General Data Protection Regulation (GDPR), which came into force in May 2018. Many technology areas are reliant on large data sets and any restrictions on their ability to use them could have significant consequences for future growth. ... Within vertical markets such as retail, where a sale can be won or lost in a matter of moments, there is no other way to make the necessary rapid-fire decisions, such as which offer to display for a specific customer as he or she enters a store. These decisions cannot wait for such transient events to be uploaded to the company’s cloud, so cloud providers such as Microsoft are revamping their own platforms to push critical analytics functions, such as predictive artificial intelligence (AI) algorithms, downstream to devices.


Marriott CEO shares post-mortem on last year's hack


"As part of our investigation into the alert, we learned that the individual whose credentials were used had not actually made the query," Sorenson said. At that point, the Marriott staff realized they were dealing with a probable breach, although they didn't know if it was something big or just the beginning of a hack that could be very easily contained before the attackers accessed any user data. The company said it brought in third-party forensic investigators on September 10, to help its IT staff look into a possible breach. The forensic firm's rummaging uncovered malware on the Starwood IT systems less than a week later. "The investigators uncovered a Remote Access Trojan ('RAT'), a form of malware that allows an attacker to covertly access, surveil, and even gain control over a computer. I was notified of the ongoing investigation that day, and our Board was notified the following day," the CEO said. Uncovering the full scope of the attack took significant forensic work, the CEO said. 


Gartner on futurology and the year 2035

One definition, explained Frank Buytendijk, is ‘futurism is about postulating possible, probable and preferable futures in order to prepare for them.’ But that implies that our role is quite passive — we sit back and wait. He prefers futurology or futurism as ‘the art and science of being able to take responsibility for the long term consequences of actions and decisions today.’ That’s an important definition. It implies we have a responsibility — we can shape and mould the future into an image we might prefer. So he asks the question: “How can we be pragmatic futurists?” Part of the problem is that our view of what the future is can be distorted by the prism of the present. Maybe our futuristic view is framed by rose-tinted crystal balls. Maybe it is distorted by whatever is in fashion at any moment. When Gartner asked for a view of the year 2035 five years or so ago, privacy was an overriding theme. In its latest survey, privacy featured far less; AI was mentioned either directly or by implication.


How to improve Apache server security by limiting the information it reveals

If you administer the Apache web server, you know there are quite a lot of things you can do to help improve its security. For example, you could (and should) employ mod_security. You could also hide directory folders, run only necessary modules, limit large requests, restrict browsing to specific directories, and so much more. But there are two often-overlooked steps you can take to help give your Apache server a bit more security: turning off the Apache signature and configuring ServerTokens. Why does this help? Simple. If you broadcast your server's specific information, you would be informing potential malicious actors what they're up against. They could learn what web server you're using, what version of the web server, the hosting platform, and even more. You don't want that information displayed for all to see. So, how do you obfuscate that information? There are two options to be configured, and I'm going to show you exactly how to set them, so as to hide your server details. ... 
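For reference, the two directives the article refers to look like this in an Apache configuration file (the path shown is the Debian/Ubuntu convention; adjust for your distribution):

```apache
# /etc/apache2/conf-available/security.conf (Debian/Ubuntu layout)

# Send only "Server: Apache" in response headers, with no version,
# OS, or loaded-module details.
ServerTokens Prod

# Drop the version/hostname footer from server-generated pages
# such as error pages and directory listings.
ServerSignature Off
```

After editing, reload Apache (for example `sudo systemctl reload apache2` on Debian-family systems) for the change to take effect.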


Where container infrastructure and management investments yield ROI


The ecosystem of infrastructure, services, tools and expertise listed above turns a simple workload isolation technology into a scalable production platform for multiple applications, batch jobs and microservices. To assess the return on investment for these Capex and Opex charges, review the capabilities each provides. ... Meta-management products appeal to organizations with production containerized application experience, whether on premises or via a cloud service, that now want to standardize on container infrastructure and possibly a PaaS development platform. Within this category of tools is a range of subcategories. Organizations can turn to infrastructure management suites, such as HashiCorp Terraform and Consul, Joyent Triton, Rancher and Mesosphere. Alternatively, PaaS offerings that do the job include Pivotal Cloud Foundry, Red Hat OpenShift and Atos powered by Apprenda.


How to determine if Wi-Fi 6 is right for you

There’s a lot of hype around the next Wi-Fi standard, 802.11ax, more commonly known as Wi-Fi 6. Often new technologies are built up by the vendors as being the “next big thing” and then flop because they don’t live up to expectations. In the case of Wi-Fi 6, however, the fervor is warranted because it is the first Wi-Fi standard that has been designed with the premise that Wi-Fi is the primary connection for devices rather than a network of convenience. Wi-Fi 6 is loaded with new features, such as Orthogonal Frequency Division Multiple Access (OFDMA), 1024-QAM (quadrature amplitude modulation) encoding and target wake time (TWT), that make Wi-Fi faster and less congested. Many of these enhancements came from the world of LTE and 4G, which solved many of these challenges long ago. These new features will lead to a better mobile experience and longer client battery life, and they will open the door to a wide range of new applications that could not have been done on Wi-Fi before. For example, an architect could now use virtual reality (VR) over Wi-Fi to showcase a house.


It's just a graph, making gravitational waves in the real world

The combination of JSON-LD and schema.org has probably done more to spread the use of RDF than anything else. Just getting Google and other search engines to adopt it has led to an array of use cases. And yet, JSON-LD was hugely controversial in its time in the RDF community. This was not the last controversy the RDF community faced, but it seems like JSON-LD's success may have had something to teach. But we'll get back to that shortly. Property graphs have been around for about 10 years, and have been driven by the industry. As such, you could say they are a reversed mirror image of RDF: Pragmatism rules, tooling is abundant and easy to use, outreach and community building are a top priority, but standardization only came as an afterthought at this point. Most property graph solutions do not have a schema, or have a very basic schema. Just getting data in and out of property graph solutions is an exercise in patience and improvisation -- good luck representing a graph structure in CSV, and mapping that from solution to solution.
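For readers who haven't seen it, a minimal JSON-LD document using the schema.org vocabulary looks like the following (sketched here in Python; the author is hypothetical). The `@context` key is what maps the plain JSON keys onto RDF terms:

```python
import json

# A minimal schema.org description of an article, serialized as JSON-LD.
doc = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "It's just a graph, making gravitational waves in the real world",
    "author": {"@type": "Person", "name": "Jane Doe"},  # hypothetical author
}
print(json.dumps(doc, indent=2))
```

Because the result is plain JSON, ordinary web developers can produce and consume it without ever touching an RDF toolchain, which is a large part of why search engines were able to drive adoption.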


Extracting value from data: how to do it and the obstacles to overcome

The most significant obstacle for information sharing exchanges, is whether the law or regulation will allow it. According to the survey, 33% of respondents said they would be unable to adjust to new regulations effectively for data protection and privacy. With certain business models this information sharing would not be easily achievable — under GDPR or CCPA (California Consumer Privacy Act) — unless with the explicit consent of the consumer. ... “A lot of this has to do with how companies are organised: 31% said we are organisationally siloed — the data that belongs to one business unit is locked up in that business unit, it is not shared with other business units — so they’re not getting the full value of their data, just because of the structure,” he says. Another interesting result from the survey was that 30% lack the data scientists or analytical talent, who would have the capabilities to better exploit the data. “So, there’s definitely a talent shortage leaving money on the table for companies,” confirms Cline.


How blockchain will manage networks

Smart Packet Contracts would protect the network from intrusions and so on, and a “Marconi Pipe” would be the channel. It provides the routing and processing. While it’s actually at the data-link OSI Layer 2 (switches, bridges in terms of hardware; and MAC and Ethernet in terms of protocols), it can also overlay on other infrastructure, such as wireless. A barter system, where network resources can be traded for compute resources, say, rounds out the concept. Monetization could indeed be introduced. Another angle is securing the multiple cloud-based systems running in enterprise. It’s a “challenge to make sure [multi-cloud] communication is secure and safe from attacks such as eavesdropping or ‘man in the middle,'" Jong Kim, chief architect of Marconi Foundation and Network World contributor, said in a VentureBeat article in January. “A common network where each connection point securely peers with every other point, regardless of cloud provider or container instance” could be provided with an Ethernet-layer blockchain.



Quote for the day:


"Challenges in life always seek leaders and leaders seek challenges." -- Wayde Goodall


Daily Tech Digest - March 07, 2019

Why Wi-Fi needs artificial intelligence
Over time, I expect AI to lead to fully autonomous networks where the AI runs the wired and wireless network. However, I don’t expect businesses to embrace the concept of a “self-driving network” immediately. Instead, the initial wave of AI as a network management tool will be to assist the engineer by providing recommendations coupled with automated basic tasks, including troubleshooting and problem avoidance. Engineers shouldn’t fear AI or worry about the technology replacing them. Instead, they should look at it as their best friend because it will free up huge amounts of time, as much of the heavy lifting will be done by machines. The access edge, particularly the wireless network, is growing in importance. But at the same time, it is being pushed to do more because more devices are connecting to it, resulting in orders of magnitude more data traversing the network. Manual operational methods have never worked and certainly will not work in a hyper-connected world. AI-based systems are becoming mandatory to keep the performance of Wi-Fi high and to shed the reputation that flaky Wi-Fi is the norm.



5 trends driving the design of next-generation data centers


The efficiency of data centers is both an environmental concern and a large-scale economic issue for operators. Enterprises in diverse industries from automotive design to financial forecasting are implementing and relying on machine-learning in their applications, which results in more expensive and high-temperature data center infrastructure. It’s widely known that power and cooling represent the biggest costs that data center owners have to contend with, but new technologies are emerging to combat this threat. ... One of the most successful technologies that data center operators have put into practice to improve efficiency is monitoring software that implements the critical advances made in machine learning and artificial intelligence. Machines are much more capable of reading and predicting the needs of data centers second to second than their human counterparts, and with their assistance operators can manipulate cooling solutions and power usage in order to dramatically increase energy efficiency.



“When you are in a disaster recovery situation, you do not want the new person trying out the wings,” says Bruce Beam, chief information officer at (ISC)². Unfortunately, the number of cyber security positions outweighs the number of available cyber security professionals. The demand for cyber security professionals has outpaced supply in recent years, due to emerging threats and organisations increasing the amount of business they conduct online. According to a study, the number of organisations that reported shortages in the cyber security skills of their staff has increased over the past four years. In 2014, approximately 23% of organisations indicated this was a challenge, but this has now risen to more than 50%. Much of this rise has been due to the increasing workload of cyber security teams. Continuing professional development (CPD) has been used to ensure that skills remain relevant. 


Open Source Benefits to Innovation and Organizational Agility

To understand how organizations use open source today, Andrew Aitken presented the state of open source in the context of its evolution from the founders until today. Aitken identified four generations. Generation one, initiated in the early 70s, is represented by the evangelists and thought leaders who founded the open source movement, Richard Stallman, Linus Torvalds, Eric Raymond, etc. Their purpose was to make software free to allow anybody to contribute to their improvement. Generation two consists of influencers, such as Marc Fleury, Marten Mickos, Larry Augustin, who began to think about how to commercialize open source and launched the first few commercial open source companies. Generation three of open source started with the proliferation of the internet and the vast amount of data that became available to organizations. Dotcoms created new technologies to manage data and started open-sourcing their software. 


"If the insurer knows our drivers are always driving well on safer routes, then we might be able to bring down our premium," says Gifford. "So, there's opportunities like that when it comes to using blockchain — and that's just an example. But success in blockchain is all about getting partners on board." Gifford says effective partnerships are critical to Wincanton's broader development efforts. The firm launched an innovation programme called W² Labs last March, which gets startups to develop innovative solutions to the firm's challenges. Wincanton also uses its internal development team and works with external consultants, such as IBM and PA Consulting. The broader aims of these combined efforts is to produce what Gifford refers to as the Internet of Transport. These developments focus on three key areas. First, Winsight, an app that enables a paperless cab, so all the paper lorry drivers normally carry, such as routes and proof of delivery, is wrapped up into a single piece of software on a smart device.


"DevOps Institute is thrilled to share the research findings that will help businesses and the IT community understand the requisite skills IT practitioners need to meet the growing demand for T-shaped professionals," said Jayne Groll, CEO of DevOps Institute. "By identifying skill sets needed to advance the human side of DevOps, we can nurture the development of the T-shaped professional that is being driven by the requirement for speed, agility and quality software from the business." Automation, process, and soft skills were the top three most important skills categories, according to the report. Soft skills—including collaboration and cooperation, problem-solving, interpersonal skills, and sharing and knowledge transfer—are equally important as technical skills to DevOps practitioners, highlighting the need for well-rounded candidates in this field. "The reality of the DevOps world is one that is frequently changing," Erin Lovern, director of global talent acquisition at CloudBees, said in the report.


IoT Expands the Botnet Universe

Botnets composed of vulnerable IoT devices, combined with widely available DDoS-as-a-Service tools and anonymous payment mechanisms, have pushed denial-of-service attacks to record-breaking volumes. At the same time, new domains such as cryptomining and credentials theft offer more opportunities for hacktivism. ... A new piece of malware takes advantage of Android-based devices exposing debug capabilities to the internet. It leverages scanning code from Mirai. When a remote host exposes its Android Debug Bridge (ADB) control port, any Android emulator on the internet has full install, start, reboot and root shell access without authentication. Part of the malware includes Monero cryptocurrency miners (xmrig binaries), which execute on the infected devices. Radware’s automated trend analysis algorithms detected a significant increase in activity against port 5555, both in the number of hits and in the number of distinct IPs.
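The kind of trend analysis described (counting both total hits and distinct source IPs against a port) reduces to a simple aggregation over connection logs. A toy sketch with invented log entries; real systems would stream this over time windows and alert on deviations from a baseline:

```python
# Hypothetical connection-log entries: (source IP, destination port).
events = [
    ("203.0.113.5", 5555), ("203.0.113.5", 5555),
    ("198.51.100.7", 5555), ("203.0.113.9", 80),
]

def port_stats(events, port):
    """Total hits and distinct source IPs observed against one port."""
    hits = [ip for ip, p in events if p == port]
    return len(hits), len(set(hits))

hits, distinct = port_stats(events, 5555)
print(hits, distinct)  # 3 2
```

Tracking hits and distinct IPs separately matters: a jump in hits from one IP suggests a single scanner, while a jump in distinct IPs (as Radware saw for port 5555) suggests a spreading botnet.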


Clearer North Korean link to global infrastructure malware campaign


The researchers were able to get a rare look at the workings of a nation state cyber espionage campaign after being handed a command-and-control server for the campaign by one of the targeted governments. This provided an opportunity to conduct a detailed analysis of code and data from the server responsible for the management of the operations, tools and tradecraft behind the campaign, previously thought to have run from October to November 2018. The analysis led to the identification of several previously unknown command-and-control centres and indicates that Sharpshooter began as early as September 2017, targeted a broader set of organisations in more industries and countries, and is currently ongoing. “McAfee Advanced Threat Research analysis of the command-and-control server’s code and data provides greater insight into how the perpetrators behind Sharpshooter developed and configured control infrastructure, how they distributed the malware, and how they stealthily tested campaigns prior to launch,” said Raj Samani.


Cisco uncorks 26 security patches for switches, firewalls

network security lock padlock breach
While the 26 alerts describe vulnerabilities that have a Security Impact Rating of “High,” most (23) affect Cisco NX-OS software, and the remaining three involve both software packages. The vulnerabilities span a number of problems that would let an attacker gain unauthorized access, gain elevated privileges, execute arbitrary commands, escape the restricted shell, bypass the system image verification checks or cause denial-of-service (DoS) conditions, Cisco said. It has released software fixes for all the vulnerabilities, and none of the problems affect Cisco IOS software or Cisco IOS XE software, the company said. Information about which Cisco FXOS Software and Cisco NX-OS Software releases are vulnerable, and what to do about it, is available in the fixed software section of the advisory. ... A couple of vulnerabilities in the Nexus software could let attackers gain elevated privileges on the switches and execute nefarious commands. The first weakness is due to an incorrect authorization check of user accounts and their associated group ID, Cisco wrote.


Artificial intelligence and cybersecurity: Attacking and defending

Social engineering remains one of the most common attack vectors. How often is malware introduced into systems when someone just clicks on an innocent-looking link? The fact is, to entice the victim to click on that link, quite a bit of effort is required. Historically, it’s been labor-intensive to craft a believable phishing email: days and sometimes weeks of research, and the right opportunity, were required to successfully carry out such an attack. Things are changing with the advent of AI in cybersecurity. Analyzing large data sets helps attackers prioritize their victims based on online behavior and estimated wealth. Predictive models can go further and determine willingness to pay a ransom based on historical data, and even adjust the size of the pay-out to maximize the chances of payment and, therefore, revenue for cybercriminals. Imagine all the data available in the public domain, along with secrets previously leaked through various data breaches, combined for ultimate victim profiling in a matter of seconds with no human effort.



Quote for the day:


"Leaders keep their eyes on the horizon, not just on the bottom line." -- Warren G. Bennis