Daily Tech Digest - July 07, 2017

Algorithmic Decomposition Versus Object-Oriented Decomposition

The principal advantage of object-oriented decomposition is that it encourages the reuse of software. This results in flexible software systems that can evolve as system requirements change. It allows a programmer to use object-oriented programming languages effectively. Object-oriented decomposition is also more intuitive than algorithm-oriented decomposition because objects naturally model entities in the application domain. Object-oriented design is a design strategy where system designers think in terms of ‘things’ instead of operations or functions. The executing system is made up of interacting objects that maintain their own local state and provide operations on that state information. They hide information about the representation of the state and hence limit access to it. An object-oriented design process involves ...
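The core idea, an object that owns its local state, exposes operations on that state, and hides its representation, can be sketched in a few lines of Python (a generic illustration, not taken from the article):

```python
class BankAccount:
    """An object models a domain entity: it owns its state and the operations on it."""

    def __init__(self, owner):
        self._owner = owner    # leading underscore: the representation is hidden
        self._balance = 0

    def deposit(self, amount):
        if amount <= 0:
            raise ValueError("deposit must be positive")
        self._balance += amount

    def balance(self):
        return self._balance   # state is reached only through operations

account = BankAccount("Ada")
account.deposit(100)
account.balance()  # → 100
```

Callers go only through `deposit` and `balance`; nothing outside the class depends on how the balance is stored, which is what leaves the representation free to evolve as requirements change.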


Virtual Panel: High Performance Application in .NET

Microsoft has made quite a few investments in platform performance. To cite some examples: .NET Native was introduced a few years ago to improve startup times and reduce memory usage for client apps; .NET 4.5 and 4.6 saw important improvements to the scalability of the garbage collector; .NET 4.6 had a revamped JIT compiler with support for vectorization; C# 7 introduced ref locals and ref returns, which are features designed to allow for better performance at the language level. All in all, it would probably be easier for me personally to write a small high-performance application in a lower-level language like C or C++. However, introducing unmanaged code into an existing system, or developing a large codebase with these languages, is not a decision to be taken lightly. It makes sense to bet on .NET for certain kinds of high-performance systems, as long as you are aware of the challenges and have the tools to solve them at the code level.


Medical devices at risk: 5 capabilities that invite danger

Chris Camejo, director of product management, threat intelligence at NTT Security, noted that most medical devices in use today would be secure “only in a closed, trusted environment without any potentially malicious activity.” “Unfortunately a hospital network can't be considered trusted, as it is connected to the internet and contains thousands of internal users, any one of whom could click on the wrong link or download the wrong attachment,” he said. Still, debate continues about how imminent the risk of physical harm is. Jay Radcliffe, a medical device security expert and Type 1 diabetic, famously said at the 2014 Black Hat conference that it would be far more likely for “an attacker to sneak up behind me and deliver a fatal blow to my head with a baseball bat” than for him to be harmed by a cyber attack.


Artificial Stupidity: Learning To Trust Artificial Intelligence (Sometimes)

While deep learning AI can surprise its human users with flashes of brilliance — or stupidity — deterministic software always produces the same output from a given input. “Machine learning cannot be verified and certified,” Cherepinsky said. “Some algorithms we chose not to use… even though they work on the surface, they’re not certifiable, verifiable, and testable.” Sikorsky has used some deep learning algorithms in its flying laboratory, Cherepinsky said, and he’s far from giving up on the technology, but he doesn’t think it’s ready for real-world use: “The current state of the art (is) they’re not explainable yet.” ... “You see in artificial intelligence an increasing trend towards lifelike agents and a demand for those agents, like Siri, Cortana, and Alexa, to be more emotionally responsive, to be more nuanced in ways that are human-like,” David Hanson, CEO of Hong Kong-based Hanson Robotics, told the Johns Hopkins conference.


Experts Predict When Artificial Intelligence Will Exceed Human Performance

The experts predict that AI will outperform humans in the next 10 years in tasks such as translating languages (by 2024), writing high school essays (by 2026), and driving trucks (by 2027). But many other tasks will take much longer for machines to master. AI won’t be better than humans at working in retail until 2031, able to write a bestselling book until 2049, or capable of working as a surgeon until 2053. The experts are far from infallible. They predicted that AI would be better than humans at Go by about 2027. (This was in 2015, remember.) In fact, Google’s DeepMind subsidiary has already developed an artificial intelligence capable of beating the best humans. That took two years rather than 12. It’s easy to think that this gives the lie to these predictions.


Major cyber incidents accelerating, says NCSC

“This increase in major attacks is mainly being driven by the fact that cyber attack tools are becoming more readily available, in combination with a growing willingness to use them,” he told The Cyber Security Summit in London. Although the WannaCry ransomware attacks in May 2017 came very close, Noble said there had been no C1-level national cyber security incidents to date. The majority of the major incidents the NCSC has dealt with were C3-level attacks, typically confined to single organisations. These account for 451 incidents to date. The remaining 29 major incidents were C2-level attacks, significant attacks that typically require a cross-government response. Across these nearly 500 incidents, Noble said there were five common themes or lessons to be learned.


Learning To Wear Your Intelligence: How To Apply AI In Wearables and IoT

Kurzweil claims that machines will pass the Turing AI test by 2029, and that around 2045, “the pace of change will be so astonishingly quick that we won’t be able to keep up, unless we enhance our own intelligence by merging with the intelligent machines we are creating”. He further claims that humans will be a hybrid of biological and non-biological intelligence that becomes increasingly dominated by its non-biological component. Kurzweil envisions nanobots inside our bodies that fight against infections and cancer, replace organs, and improve memory and cognitive abilities. ... The artificial general intelligence (AGI) or strong AI community, though varying widely on the timeframe for reaching the singularity, is in consensus that it’s plausible, though most mainstream AI researchers doubt that progress will be rapid.


A Tour of Recurrent Neural Network Algorithms for Deep Learning

Recurrent neural networks, or RNNs, are a type of artificial neural network that add weights to the network to create cycles in the network graph, in an effort to maintain an internal state. The promise of adding state to neural networks is that they will be able to explicitly learn and exploit context in sequence prediction problems, such as problems with an order or temporal component. After reading this post, you will know: How top recurrent neural networks used for deep learning work, such as LSTMs, GRUs, and NTMs; How top RNNs relate to the broader study of recurrence in artificial neural networks; and How research in RNNs has led to state-of-the-art performance on a range of challenging problems.
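The cycle-plus-state idea can be sketched without any framework: a single-unit recurrence (not an LSTM or GRU, just the vanilla scheme they build on) whose output at each step depends on both the current input and the previous state. The weights below are arbitrary illustrative values:

```python
import math

def rnn_step(x, h, w_x, w_h, b):
    # one recurrent step: the new state mixes the current input with the previous state
    return math.tanh(w_x * x + w_h * h + b)

def run_sequence(xs, w_x=0.5, w_h=0.9, b=0.0):
    h = 0.0                    # the internal state carried across time steps
    states = []
    for x in xs:
        h = rnn_step(x, h, w_x, w_h, b)
        states.append(h)
    return states

states = run_sequence([1.0, 0.0, 0.0])
```

Even though the second and third inputs are zero, the later states remain nonzero: the recurrent connection carries context forward from the first input, which is exactly the property sequence prediction exploits.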


Machine Learning, Artificial Intelligence - And The Future Of Accounting

When accounting software companies eliminated desktop support in favor of cloud-based services, accounting firms were forced to adapt to life in the cloud. Similarly, accounting departments and firms will be forced to adopt machine learning to remain competitive since machines can deliver real-time insights, enhance decision making and catapult efficiency. Rather than eliminate the human workforce in accounting firms, the humans will have new colleagues—machines—who will pair with them to provide more efficient and effective services to clients. Currently, there is no machine replacement for the emotional intelligence requirements of accounting work, but machines can learn to perform redundant, repeatable and oftentimes extremely time-consuming tasks.


Model-Based Software Engineering to Tame the IoT Jungle

The ThingML approach's first goal is to allow abstracting from the heterogeneous platforms and IoT devices to model the desired IoT system's architecture. In practice, platforms and devices, as well as the final distribution of software components, typically aren't known during the early design phases. The architecture model consists of components, ports, connectors, and asynchronous messages. Once the general architecture is defined, our approach allows for specification of the components' business logic in a platform-independent way using statecharts and the action language. ThingML statecharts include composites, parallel regions, and history states. The state machines typically react to events corresponding to incoming messages on a component's port.
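ThingML generates code from such models, but the reactive core, a state machine driven by incoming messages, can be sketched in plain Python (a toy illustration, not ThingML syntax; the component and message names are made up):

```python
class Thermostat:
    """Toy component: a state machine reacting to messages arriving on its port."""

    # transitions keyed by (current state, incoming message)
    TRANSITIONS = {
        ("idle", "temp_low"): "heating",
        ("heating", "temp_ok"): "idle",
    }

    def __init__(self):
        self.state = "idle"

    def on_message(self, message):
        # a (state, message) pair with no transition leaves the state unchanged
        self.state = self.TRANSITIONS.get((self.state, message), self.state)

t = Thermostat()
t.on_message("temp_low")   # idle -> heating
t.on_message("temp_low")   # no transition defined: stays in heating
```

Composites, parallel regions, and history states layer more structure on top, but each still reduces to this pattern of events selecting transitions.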



Quote for the day:


"Integrity is the soul of leadership! Trust is the engine of leadership!" -- Amine A. Ayad


Daily Tech Digest - July 06, 2017

Connecting 400 million people to the IoT? No sweat.

The first wave of the roll out will connect nearly 2000 communities and covers an estimated 400 million people, making the project the first of its kind in terms of scale. In addition to successful trials in Mumbai, Delhi and Bangalore, the two firms have 35 proof-of-concept applications under trial over the network. Tata are aiming to speed up digital transformation in a variety of sectors including supply chain, utilities and healthcare. “The sheer size of this project is incredible, bringing new services to millions of people,” said David Sliter, vice president and general manager of the Communications Solutions Business at HPE. “Through our partner-centric approach, the HPE Universal IoT platform will enable Tata Communications to build multiple vertical use cases for its IoT network in India on a common platform with a common data model.”


Navigating the AI ethical minefield without getting blown up

As AI, big data, and the related fields of machine learning, deep learning, and computer vision/object recognition rise, buyers and sellers are rushing to include AI in everything, from enterprise CRM to national surveillance programmes. An example of the latter is the FBI’s scheme to record and analyse citizens’ tattoos in order to establish if people who have certain designs inked on their skin are likely to commit crimes. Such projects should come with the label ‘Because we can’. In such a febrile environment, the risk is that the twin problems of confirmation bias in research and human prejudice in society become an automated pandemic: systems that are designed to tell people exactly what they want to hear; or software that perpetuates profound social problems.


How to manage vendors in a cloud-first world

The first is centralized, where everything -- including contract, performance, financial, risk and relationship management -- is handled centrally by vendor management and procurement professionals. This offers control and consistency, but can be inflexible and slow to react. Another common approach is decentralized, where vendor management is generally handled by individual business and IT departments, which has the benefit of being more responsive to issues as they arise but can have laxer and less consistent controls. The third, more recent approach, is hybrid, where vendor management functions are spread across the organization, according to where they fit best. In this scenario certain functions, such as contract management, could be handled centrally, while others, such as performance management, are carried out by individual departments or IT groups.


3 steps to create a digital banking relationship center

CIOs should help the business steer the transformation to a digitally enabled contact center. However, the work also involves developing an integrated platform that can help a new breed of “empathetic advisors” to serve customers. For instance, CIOs can help develop capabilities that allow intelligent machines to produce data insights that are shared with customer service agents so that they can deliver the most personalized services possible.  Right now, legacy technology hinders positive customer experiences. Data is piecemeal, there are gaps in knowledge, and silos make it near impossible to gain a comprehensive view of the customer. With an effective platform strategy, banks can break through the silos and deliver personalized customer experiences.


Use the cloud to achieve operating model transformation: Six steps

This also means that the cloud should be just one component of corporate IT's operating model transformation. Cloud can't be expected to deliver top-line digital results without software design and development strategies that connect it to digital business projects. This has spurred IT leaders' efforts to bring DevOps into enterprises and include the use of teams that coach developers to code for cloud-based architectures. It's not possible for cloud to deliver without new business engagement approaches that connect cloud adoption strategies to opportunities for scale and innovation. In one organization we studied, this led to the establishment of "cloud champions" -- commercial product managers who teach business unit leaders how to make use of cloud capabilities in their products for better commercial outcomes.


6 Robo Advisory Firms Trying Hard to Innovate

We’re often puzzled as to how people can use terminology to put an entirely new coat of paint on something that’s been around for a long time and then purport that it’s new and exciting. Take robo advisors as an example. In the good old days, you met with a wealth manager who asked you a bunch of questions and then plugged them into a piece of software that said where you should optimally invest your money. Usually, this is just a very simple asset allocation strategy with X% in bonds and Y% in stocks based on your age and appetite for risk. Then, as a result of the perpetual cost savings initiatives that permeate the corporate world, we decided to remove the human element, externalize the software, and call it a “robo advisor”, a term that somehow implies that there is more under the hood than there actually is.
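The "X% in bonds and Y% in stocks" logic the author describes is simple enough to sketch; the rule below is invented for illustration and is not any particular firm's model:

```python
def allocate(age, risk_appetite):
    """Hypothetical rule of thumb: the bond share grows with age, shrinks with risk appetite."""
    # risk_appetite: 0 (very cautious) .. 3 (aggressive); clamp to a 10-90% band
    bonds = min(90, max(10, age - 10 * risk_appetite))
    return {"bonds": bonds, "stocks": 100 - bonds}

allocate(40, 2)  # → {'bonds': 20, 'stocks': 80}
allocate(70, 0)  # → {'bonds': 70, 'stocks': 30}
```

That a robo advisor's core can fit in a handful of lines is precisely the author's point: the "robo" label implies more machinery than is actually under the hood.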


Is your sandbox strategy keeping you safe?

To better detect APTs, security professionals are deploying advanced threat detection and protection technologies, often including virtual sandboxes which analyse the behaviour of suspicious files and uncover hidden, previously unknown malware. However, threats are getting smarter each day, and many vendors’ sandbox techniques simply have not kept pace. An APT is a set of stealthy and continuous computer hacking processes, often orchestrated by criminals targeting a specific entity. These threats often include unknown and undocumented malware, including zero-day threats. They are designed to be evolving, polymorphic and dynamic. They are targeted to extract or compromise sensitive data, including identity, access and control information. While these types of attacks are less common than automated or commoditised threats that are more broadly targeted, APTs pose a serious emerging threat.


Data Center Automation Advances Hybrid Cloud

A recent study by Morar Consulting concludes that it's difficult to realize savings in energy consumption and efficient operations without a data center infrastructure management system (DCIM). The implication of the study is that data centers have gotten so complex with so many invisible, moving parts, including virtual servers and containers, that average systems administrators/data center operations specialists need all the help they can get. Morar contacted 101 CIOs, CTOs and senior data center managers across the US for the study and another 100 in the UK. "A successful DCIM solution deployment allows managers to understand, manage and optimize the myriad amounts of data under their control," the Morar study said. The study was sponsored by Intel and Schneider Electric.


Get Ready for the Post-Cloud World

In the fast-approaching post-cloud world, the new differentiator is in how well an organization procures and consumes many digital business assets simultaneously, dynamically, and in deep synergy — across a diverse and complex supply chain and easily reconfigured installed base. Technology’s new poster child, artificial intelligence (AI), and its engine, machine learning, will be a major reason why cloud goes mundane, practically invisible, fairly soon. To make the most of a spectrum of apps and data hosting options, many variables need to be considered, evaluated, measured, reassessed and implemented. Repeat. All the time. To me, managing the complexity of such hybrid computing real-time procurement is a killer app of AI. These are not necessarily the skills resulting from completing a Microsoft certification process.


The Robot Automation Tipping Point

There are robots all around you, and you haven’t noticed: from factory automation to self-driving cars, to drones, to the machines that make your pizza and process your parking payment, to the wild and fun automation that happens at a Disney resort. You can also build bots to analyze large amounts of data. Do you trade stocks? Well, 99% of the processing is happening on computers rather than with brokers. That’s right: even the trades are not completed by brokers anymore. People are going crazy over “chat bots”, but they are just one part of the automation tipping point. Humans love stories. We love reading books and engaging in conversation. The messaging interface is simple and universal, which allows people of all ages and from all walks of life to understand it and use it. But what about all those people losing their jobs?



Quote for the day:


"In programming, simplicity and clarity are not a dispensable luxury, but a crucial matter that decides between success and failure." -- Dijkstra


Daily Tech Digest - July 05, 2017

Seven tips for working with Office shapes

Shapes are drawing objects—lines, circles, rectangles, and so on—that you can use to enhance Office documents. You might add a simple line to distinguish your name and address in your resume. Or you might add a bit of pizazz to a marketing document. Shapes are available in Excel, Outlook, Word, and PowerPoint. You can enhance them using colors, patterns, borders, and other special effects. ... If you find yourself tweaking the same shape each time you enter it, stop. Instead, modify one shape and then set its properties to the shape's default properties. To do so, right-click the modified shape and choose Set As Default Shape from the shortcut menu. All subsequent shapes will exhibit your custom properties instead of the out-of-the box defaults.


Minerva protects endpoints with trickery and deception

The idea is that most normal threats will be blocked by traditional antivirus and Minerva will stop anything that attempts to get around that protection. In fact, Minerva officials stress that their toolset won’t protect anything without some type of antivirus first installed. It’s designed to work with any antivirus program, including Windows Defender and any of the offerings from companies like Symantec, McAfee, AVG, TrendMicro and others. The Minerva protection is installed as software, with the main interface and console running locally on a customer’s server or based within the cloud. Our test program was active on a physical server. Once installed, the program pushes agents out to every endpoint that needs to be protected. The agents are very lightweight, with each one taking up about 24 megabytes.


What are the impacts of facial recognition tech on society?

The perception was much different pre-September 11, 2001. This type of futuristic technology was something people saw only in Hollywood, and it fell under the umbrella of “Big Brother is watching us.” At Super Bowl XXXV the federal government ran a test in which it scoured the 100,000 attendees and reported to have found 19 potential risks. This test was subsequently discovered by the media, leading to public conversation on privacy concerns. When questioned about the secret test, Tampa police spokesman Joe Durkin said, “It confirmed our suspicions that these types of criminals would be coming to the Super Bowl to try and prey on the public.” The dilemma, which in my opinion was the result of 9/11, becomes a conversation about improved security and the impact on our personal privacy.


Consumer IoT vs. Industrial IoT – What are the Differences?

Because IIoT systems can result in the generation of billions of datapoints, consideration also has to be afforded to the means of transmitting the information from the sensors to their final destination – usually an industrial control system such as a SCADA (supervisory control and data acquisition) platform. In order to not overwhelm these centralized systems with data, IIoT manufacturers are increasingly devising hardware that can carry out preliminary analytics directly at the device-level rather than on a program running in a cloud-based server (an emergent methodology known as edge computing or fog computing). Consumer IoT applications naturally tend to involve fewer devices and data points. Minimizing throughput to central servers is therefore less of a concern.
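A sketch of what device-level preliminary analytics can look like: instead of forwarding every raw datapoint, the edge node ships a window summary plus only the anomalous readings (the field names and threshold are illustrative assumptions):

```python
def summarize_window(readings, threshold=75.0):
    """Edge-side reduction: ship a summary and the anomalous raw values, not every datapoint."""
    return {
        "count": len(readings),
        "mean": sum(readings) / len(readings),
        "max": max(readings),
        "alerts": [r for r in readings if r > threshold],  # only suspect values travel upstream
    }

# one window of temperature readings from a sensor: four datapoints become one small payload
payload = summarize_window([70.1, 70.3, 70.2, 80.5])
```

Scaled across thousands of sensors, this kind of reduction is what keeps a central SCADA platform from being overwhelmed by raw telemetry.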


Big data is helping reinvent companies through digital transformation

Stratio brings together the latest, most disruptive technologies into a product that transforms businesses in-flight. Stratio DataCentric is designed to help Fortune 500 companies radically overhaul their information architecture in small, tactical steps and put data at the core of their business so that daily decisions and strategic planning can be taken based on powerful insight and well-grounded knowledge. With a lack of skills and inertia in organisational culture being major barriers to transformation, we provide both the technology and the skills that companies need. Our team of 300 staff accompany clients on their journey through transformation and we often establish joint-ventures. ... We have also developed a digital promotions platform that sends targeted deals to the customers of a global retailer in real-time.


8 Things Every Security Pro Should Know About GDPR

Companies that fail to comply with GDPR requirements can be fined between 2% and 4% of their annual global revenues or up to €20 million - which at current rates works out to just under $22.4 million USD - whichever is higher.  Enforcement of GDPR begins May 25, 2018. It replaces Data Protection Directive 95/46 EC, a 1995 statute governing the processing and protection of private data by companies within the EU. One of its biggest benefits for covered entities is that GDPR establishes a common data protection and privacy standard for all member nations within the EU. Organizations within the EU and elsewhere will still need to deal with data protection authorities in each of the 28 member countries. But they will no longer be subject to myriad different requirements from each member nation.
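The "whichever is higher" rule is easy to misread, so here is a minimal sketch; the 4%/€20 million figures come from the text above, while the revenue figures are hypothetical:

```python
def max_fine_eur(annual_global_revenue_eur, tier_pct, tier_cap_eur):
    # GDPR caps are "a percentage of revenue or a fixed amount, whichever is higher"
    return max(annual_global_revenue_eur * tier_pct, tier_cap_eur)

# upper tier from the text: 4% of revenue or €20 million
max_fine_eur(1_000_000_000, 0.04, 20_000_000)  # → 40,000,000 for a €1bn company
max_fine_eur(100_000_000, 0.04, 20_000_000)    # → 20,000,000: the fixed cap dominates
```

The fixed amount is thus a floor on the maximum fine, not a ceiling: large firms face the percentage, small firms the €20 million figure.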


Alex Pinto on the intersection of threat hunting and automation

The environments where I've seen the most success with threat hunting utilized their incident response (IR) team for the task or built a threat hunting offshoot from their IR team. These team members were already very comfortable with handling incidents within the organization. They already understood the environment well, knew what to look for, and where they should be looking. IR teams may be able to spend some of their time proactively looking for things and formulating hypotheses of where there could be a blind spot or perhaps poorly configured tools, and then researching those potential problem areas. Documentation is key. By documenting everything, you build organizational knowledge and allow for consistency and measurement of success.


Operationalizing Data Governance: Actualizing Policies for Data Validation

Businesses are finally recognizing the value of data governance as an enterprise program, with the emergence of the Chief Data Officer (CDO) as a C-level role charged with ensuring that corporate data sets meet defined business data quality expectations. There are two aspects to this governance: defining policies, and operationalizing their compliance. One straightforward approach to data validation is to make it part and parcel of the application architecture: adjust the development methodologies to direct software designers to embed data validation as an integral component of their applications. Institutionalizing data validation within the organization’s application environment is predicated on standardizing an approach for defining data quality. Yet as is becoming more apparent, the definition of “data quality” is non-monolithic.
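Embedding governance-defined validation in an application might look like this minimal sketch (the rule names and record fields are invented for illustration):

```python
def validate_record(record, rules):
    """Run governance-defined checks against one record; return the names of failing rules."""
    return [name for name, check in rules.items() if not check(record)]

# in practice the rules would come from the data governance program; these are invented examples
rules = {
    "customer_id present": lambda r: bool(r.get("customer_id")),
    "amount non-negative": lambda r: r.get("amount", 0) >= 0,
}

validate_record({"customer_id": "C42", "amount": -5}, rules)  # → ['amount non-negative']
```

Keeping the rules as data rather than hard-coded branches is what lets policy owners change quality definitions without rewriting the application, which matters precisely because "data quality" is non-monolithic.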


Less Is More For Canadian Quantum Computing

Researchers in Canada have found a way to make a key building block for quantum computing from a custom photonics chip and off-the-shelf components intended for use in telecommunications equipment. They have built a chip that can create entangled pairs of multicolored photons. The result is that they can be manipulated as two "qudits," quantum computing digits, that can each hold 10 possible values. Where classical computers operate on values in sequence, quantum computers are able to express all possible values of a variable simultaneously, collapsing to the "right" answer at the end of the calculation. Not all computing problems benefit from this treatment, but it is particularly useful in the factorization of large numbers, necessary for cracking many forms of encryption.
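The appeal of 10-valued qudits over two-valued qubits lies in how fast the state space grows with register size; counting basis states makes this concrete:

```python
def basis_states(n, d):
    # a register of n base-d digits distinguishes d**n values; a quantum register of
    # n qudits has a state space spanned by that many basis states
    return d ** n

basis_states(10, 2)   # → 1024 basis states for 10 qubits
basis_states(10, 10)  # → 10,000,000,000 basis states for 10 ten-level qudits
```

Ten qubits span 1,024 basis states, while ten base-10 qudits span ten billion, which is why packing more levels into each photon is an attractive route to larger quantum registers.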


How to use the mediator design pattern in C#

In the mediator design pattern, the objects don’t communicate with one another directly but through the mediator. When an object needs to communicate with another object or a set of objects, it transmits the message to the mediator. The mediator then transmits the message to each receiver object in a form that is understandable to it. By eliminating the direct communication between objects, the mediator design pattern promotes loose coupling. ... Note that the mediator design pattern differs from the facade design pattern. The mediator pattern facilitates how a set of objects interact, while the facade pattern simply provides a unified interface to a set of interfaces in the application. Thus the mediator pattern is a behavioral pattern that deals with object behavior, while the facade pattern is a structural pattern that deals with object composition.
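The article works in C#; the same shape can be sketched in Python (the `ChatRoom`/`Member` names are illustrative, not from the article):

```python
class ChatRoom:
    """Mediator: members talk to the room, never directly to each other."""

    def __init__(self):
        self.members = []

    def register(self, member):
        self.members.append(member)
        member.room = self

    def broadcast(self, sender, text):
        # the mediator decides who receives the message and in what form
        for member in self.members:
            if member is not sender:
                member.receive(sender.name, text)

class Member:
    def __init__(self, name):
        self.name = name
        self.room = None
        self.inbox = []

    def send(self, text):
        self.room.broadcast(self, text)  # no direct references to other members

    def receive(self, sender_name, text):
        self.inbox.append(f"{sender_name}: {text}")

room = ChatRoom()
alice, bob = Member("Alice"), Member("Bob")
room.register(alice)
room.register(bob)
alice.send("hello")
# bob.inbox → ['Alice: hello']; alice never holds a reference to bob
```

Because members know only the room, adding a third participant changes nothing in `Member`, which is the loose coupling the pattern is after.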



Quote for the day:


Brooks' Law – "Adding manpower to a late software project makes it later."


Daily Tech Digest - July 04, 2017

The Internet of Things for banks: it’s here, it’s real, and it’s set to grow exponentially

Overall, with access to the unique data of individual homes, IoT has the potential to create personalised mortgages, factoring in both the property and the person. In fact, the financial sector can bring progress to the housing market with a whole range of technologies. The immediacy of transactions run on distributed ledger technology could verify house purchases more efficiently; processing time can be improved by big data stored in the cloud. But with the rise of data and connected devices, CIOs and CISOs are clearly under increasing pressure. Not only from the demands for increased scalability, as the sources of data presented by the rise of the IoT grows, but also building trust around security. So how can financial services firms manage and analyse the huge amount of data the IoT generates, while exploiting the flexibility of on-demand services, without compromising security?


The Fourth Industrial Revolution: Why you need a global IoT strategy

Get past the pilot stage of IoT that many organizations seem to be stuck in. Even if you are just trialing IoT technology, you should already know the next step you will take if the trial works. For instance, if you are trialing sensors to track goods flows through your warehouse, your next step might be to place sensors on goods and on trucks so you can track transport. Your end goal might be 360-degree visibility of all goods in your supply chain from point of origin to point of shipment. Adopt IoT technologies that can interoperate with each other. There are still incompatibilities between IoT products from different vendors. Find vendors that participate in standards groups that promote device interoperability. Keep your focus on security. Botnets like Mirai will continue to attack IoT. Your security should be regularly updated so you have the best possible protection.


Should organisations be switching their certificate authority?

To ensure the security of your cryptographic assets, agility is key. Let’s say your CA is compromised by a cyber attack and your certificates from that CA move to an untrusted state. First, you have to be able to locate all impacted certificates. You’ll then need to reissue certificates from another CA. Which CA’s management console will you use to complete this arduous task? Can you expect any CA to provide the functionality that helps you move certificates to a competing CA? Granted, a compromise is probably the worst-case scenario for switching CAs. However, there may be other cases which are less dramatic, but are still as important to address. Let’s say the employees at your CA make an (all too human) mistake such as mis-coding a batch of certificates or accidentally revoking a root certificate. You’re basically left in the same situation; you experience a service outage.
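The "locate all impacted certificates" step presupposes a CA-agnostic inventory that can be queried by issuer; a toy sketch, with hosts and CA names made up:

```python
from datetime import date

# hypothetical inventory built by a network scan; hosts and CA names are invented
inventory = [
    {"host": "app1.example.com", "issuer": "CA-Alpha", "expires": date(2018, 1, 15)},
    {"host": "app2.example.com", "issuer": "CA-Beta",  "expires": date(2018, 6, 1)},
    {"host": "vpn.example.com",  "issuer": "CA-Alpha", "expires": date(2017, 11, 3)},
]

def impacted_by(inventory, compromised_issuer):
    """After a CA compromise, the first job is listing every certificate that CA issued."""
    return [cert["host"] for cert in inventory if cert["issuer"] == compromised_issuer]

impacted_by(inventory, "CA-Alpha")  # → ['app1.example.com', 'vpn.example.com']
```

The point is that this inventory must live outside any single CA's management console; agility means being able to answer the question, and then reissue, without the cooperation of the CA you are leaving.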


Get started with the Windows Subsystem for Linux

Technically, WSL is for console-only applications, providing shell support for developer tools and remote access to Linux servers running on-premises and in the public cloud. But it’s turned out to be a lot more flexible, and although this is not officially supported, users have installed and run X-based GUI applications, using Windows X Servers to bring a full Linux desktop experience to WSL. Working with any of the WSL personalities is like working with native Linux. You’ve got access to a shell, and through it the command line. Installing applications is as simple as using apt-get on Ubuntu or yast and zypper on Suse. When Fedora makes its way to Windows, you’ll use yum. Early WSL builds had problems running some applications, because the key dependencies weren’t supported. But since the Windows 10 Anniversary Update release, it’s been a lot easier, and now even complex packages like Docker install and run.


The Hidden Pitfalls Of Going Freelance In IT

“This is exacerbated in the IT world, because more often than not, you are going to be working remotely,” says Brattoli, who’s been freelancing on and off for his entire IT career. “Technology is wonderful in that it makes it possible for us to work from anywhere with an Internet connection. But there is still value in being able to meet face-to-face, and many companies are hesitant to trust someone they haven’t met.” In addition, at many companies the tech-savvy people running a project will know what needs to be done to meet the desired outcomes. “But once that’s all figured out, it is very hard to convince the people above them to go through with it,” Brattoli says. “Where technology is concerned, people who are less tech-savvy are going to be wary of any new changes to infrastructure.”


Analysis: Top Health Data Breaches So Far in 2017

As of July 3, 149 breaches affecting a total of nearly 2.7 million individuals have been reported to federal regulators so far in 2017, according to the Department of Health and Human Services' so-called "wall of shame" website of breaches affecting 500 or more individuals. Of those 2017 breaches, 53 are listed as hacking/IT incidents. And although they only represent about one-third of the breaches reported in 2017, those incidents are responsible for affecting 1.6 million individuals, or about 60 percent of the victims impacted. Those incidents include a ransomware attack reported to HHS on June 16 by Airway Oxygen, a Michigan-based provider of oxygen therapy and home medical equipment. That incident is listed on the federal tally as affecting 500,000 individuals, making it the second largest health data breach posted so far this year.


10 tips for mastering PowerPoint

With an estimated 500 million users worldwide, PowerPoint remains a key presentation tool in many enterprises, with Microsoft recently adding collaboration tools to further enhance its utility in the workplace. However, the platform includes many features that often fly under the radar, but can give your slide decks a boost. "We all know how easy it is to create and deliver a bad, mind-numbing presentation," wrote TechRepublic feature editor Jody Gilbert. "Fortunately for both presenters and their hapless victims, various add-ons are available to make presentations more functional and compelling." Here are 10 popular TechRepublic articles with tips for becoming a Microsoft PowerPoint expert and getting the most out of the presentation program.


The Growing Role of Machine Learning in Monitoring

While the adoption of machine learning in DevOps is relatively slow compared to other industries, the potential is huge. To start understanding what there is to gain from this rapidly developing field, one need only look at the world of monitoring and log analysis, where machine learning can be used to alleviate some of the main pain points experienced by DevOps teams: namely, the analysis of vast volumes of data and the extraction of actionable insights from that data. Based on the monitoring solutions on show at Monitorama this year, I can safely claim that in this space at least, the machine learning revolution is well under way. ... Using a combination of supervised and unsupervised machine learning algorithms, Moogsoft promises to improve the signal-to-noise ratio of alerts and to correlate those alerts across your toolsets in real time.
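Moogsoft's algorithms are proprietary, but the core idea of alert correlation can be illustrated with a simple sketch: collapse alerts that share a fingerprint and arrive within a short time window into a single incident. The field names and window below are illustrative assumptions, not Moogsoft's actual model:

```python
def correlate(alerts, window_s=300):
    """Collapse alerts that share a (host, service) fingerprint and arrive
    within window_s seconds of the previous one into a single incident."""
    incidents = []
    for alert in sorted(alerts, key=lambda a: a["ts"]):
        key = (alert["host"], alert["service"])
        for inc in incidents:
            if inc["key"] == key and alert["ts"] - inc["last_ts"] <= window_s:
                inc["alerts"].append(alert)   # same ongoing incident
                inc["last_ts"] = alert["ts"]
                break
        else:
            incidents.append({"key": key, "alerts": [alert], "last_ts": alert["ts"]})
    return incidents

raw = [
    {"host": "web1", "service": "nginx", "ts": 0},
    {"host": "web1", "service": "nginx", "ts": 60},    # duplicate of the first
    {"host": "db1",  "service": "mysql", "ts": 90},    # unrelated incident
    {"host": "web1", "service": "nginx", "ts": 120},   # still the same incident
]
print(len(raw), "alerts ->", len(correlate(raw)), "incidents")  # 4 alerts -> 2 incidents
```

Real tools add topology and supervised feedback on top of this, but the noise reduction comes from exactly this kind of grouping.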


Make room for AI applications in the data center architecture

"If you look at the deep learning algorithms, they're extremely communication-intensive," Dekate said. "Trying to architect solutions for such a chatty application stack is going to be very hard for organizations to get their heads around." As data center networking architects prep their infrastructures for AI they must prioritize scalability, which will require high-bandwidth, low-latency networks and innovative architectures, such as InfiniBand or Omni-Path. The key is to keep all options open for automation, Perry said. The market is quickly maturing with automated data center infrastructure management technologies, a sign that automation is becoming more widely accepted in data centers. "Once more automation features are in place … this will help set the stage for the introduction of AI," Perry said.


Digitisation to transform the UK’s criminal justice system

Digitisation provides the opportunity to rebuild the processes of the justice system around the citizen. Pilot initiatives such as the digital case file and online plea submissions have begun to prove the concept in practice, showing how digitisation can increase access to justice whilst reducing costs, streamlining processes and improving quality. Liz Crowhurst, policy officer, The Police Foundation and the report’s author, said: “At a time when justice agencies are under pressure to reduce costs, even as the complexity of cases increases, digitisation offers significant opportunities to radically improve services while increasing cost-efficiency and transparency. This, in turn, will deliver improved outcomes for victims, witnesses, defendants and offenders.”



Quote for the day:


"Next Industry to Embrace Blockchain is Aerospace" Accenture - CoinTelegraph


Daily Tech Digest - July 03, 2017

The numbers don’t lie: Why women must fill the data scientist demand

Being able to inspire a team and see the big picture are both important. A data scientist must be able not only to collect and analyze data but also to draw meaningful insights and understand what they mean for the company. The ability to view a situation holistically is a competitive differentiator for organizations as well as a positive attribute that many women possess. Once we begin associating a variety of skills with data science, the perceptions of our industry can change. According to the Washington Post, women now make up 40 percent of graduates with degrees in statistics – a popular starting point for a career in data science. While a degree in mathematics is a great place to start, it’s important not to categorize the position as being completely scientific and technical, suited only for individuals who excel at math and science.


Global Risks in 2040: Q&A with Andrew Parasiliti

The center recently undertook an effort to envision the world in 2040, and the security challenges that will shape it: artificial intelligence, 3-D printing, the accession of the millennial generation, and the sheer speed with which our society moves and makes decisions. The lead investigators are all early-career researchers, drawn from fields as diverse as nuclear strategy, anthropology, and microeconomics. ... We have more projects underway on artificial intelligence. We're working on a study about how the growth in communications technology, the Internet of Things, and big data are all redefining and compromising privacy, and what that means for security. We are, in general, interested in the changing nature of power and governance in the international system, and, increasingly, how that links up with the challenge of what Michael Rich, RAND's president and CEO, has been calling “truth decay.”


UK tech sector urges public to embrace robots in the workplace

UK tech experts say that the appeal of robotics and AI has the potential for reducing menial tasks in the workplace. In fact, 85% of respondents think that the biggest benefit of robots will be increased efficiency, with a further 40% of the opinion that these efficiencies will be felt in the wider UK job market. A fifth of respondents (21%) think robots will actually create job opportunities, while 20% said they will reduce job opportunities. “Robots and AI have the potential to save tech professionals from menial tasks and free up their time considerably. Humans are capable of so much more than administrative paper pushing, so if robots are able to alleviate some of these pressures, who knows what more we can achieve,” commented Tom Butterworth, managing director of Silicon Valley Bank’s Early Stage Practice.


The Internet of Things will power the Fourth Industrial Revolution

By 2020 more than 50 billion things, ranging from cranes to coffee machines, will be connected to the internet. That means a lot of data will be created - too much data, in fact, to be manageable or to be kept forever affordably. Gateways can help; they not only dispatch traffic but carry out some analytics functions, so that data can be better managed. For example, they could be used to filter out ‘normal’ data over time and to look for unusual patterns which may indicate a problem. They can also reduce the cost of transmitting and storing all that data. In next-generation network technology, these gateways will be used dynamically as part of the network where and when needed. But this brave new world is not without its challenges. One by-product of more devices creating more data is that they are speaking lots of different programming languages.
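A gateway filter of the kind described, forwarding only unusual readings, could be sketched as a rolling-statistics check. The window size and deviation threshold below are illustrative assumptions:

```python
from collections import deque
from statistics import mean, stdev

class AnomalyGateway:
    """Forward only readings that deviate sharply from recent history,
    filtering out 'normal' data at the edge."""

    def __init__(self, window=50, threshold=3.0):
        self.history = deque(maxlen=window)
        self.threshold = threshold

    def should_forward(self, value):
        unusual = False
        if len(self.history) >= 10:  # need some history before judging
            mu, sigma = mean(self.history), stdev(self.history)
            unusual = sigma > 0 and abs(value - mu) > self.threshold * sigma
        self.history.append(value)
        return unusual

gw = AnomalyGateway()
readings = [20.1, 19.9] * 10 + [100.0]        # steady sensor, then a spike
forwarded = [r for r in readings if gw.should_forward(r)]
print(forwarded)  # [100.0]
```

Only the spike crosses the threshold, so only one reading out of twenty-one leaves the gateway; the rest never consume bandwidth or storage.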


Robotics, Dentistry and the Future

Robotics is also making dental implant surgeries safer, quicker and more precise. In fact, pairing robotics with digital 3-D mapping and 3-D printing makes the work far easier. Models produced by 3-D printers for implant procedures give finely detailed end results, and these 3-D models can be excellent surgical guides: they can be used not just for pre-surgical planning but also to provide intra-operative positioning verification. This again supports the point that, instead of feeling threatened by technology and dental robotics, dentists should embrace this growing field with open arms. Another emerging trend is the use of nanobots. These bacteria-sized robots help perform procedures that would normally not be possible for the human hand.


Introducing wait stats support in Query Store

Nobody likes to wait. SQL Database is a multithreaded system that can handle thousands of queries executed simultaneously. Since queries executed in parallel compete for the same resources (tables, memory, etc.), they might need to wait for those resources to become available before proceeding with execution. These cumulative waits can be very large and degrade overall database performance. There are more than 900 wait types in SQL Server; some are more important or more frequent than others. For a long time, the only way you could get closer to understanding the wait bottleneck of your workload was to look at instance-level (sys.dm_os_wait_stats) or, more recently, session-level (sys.dm_exec_session_wait_stats) wait statistics. These options have certain limitations and might not provide an optimal experience.
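Those DMVs return cumulative wait times per wait type, so finding the dominant waits is a simple ranking exercise. A sketch over sample rows (the wait types are real SQL Server wait types; the figures are invented):

```python
# Each tuple mimics a (wait_type, wait_time_ms) row as returned by
# sys.dm_os_wait_stats; the wait types are real, the figures invented.
sample_waits = [
    ("PAGEIOLATCH_SH", 48_000),
    ("LCK_M_X",        120_500),
    ("CXPACKET",       15_200),
    ("WRITELOG",       9_900),
]

def top_waits(rows, n=3):
    """Rank wait types by cumulative wait time, descending."""
    return sorted(rows, key=lambda r: r[1], reverse=True)[:n]

for wait_type, ms in top_waits(sample_waits):
    print(f"{wait_type:<16}{ms:>9} ms")
```

Here lock waits (LCK_M_X) dominate, pointing at blocking rather than I/O; Query Store's wait categories do this bucketing per query rather than per instance.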


Artificial intelligence turns critical for banks facing nimble fintech rivals

For traditional banking institutions facing competition from fintech players which are more nimble, AI is a critical tool to improve customer experience. One such example of this is facial recognition technology, which is 10 to 15 times more accurate than human beings at identifying people. In its several other forms, such as advanced virtual assistants and voice assistants, AI software is proving to be faster and better than service agents in responding to customer queries and emails at the contact center level. Australia’s Westpac is using visual recognition to enable customers to activate their new cards via their smartphones. Barclays has been using voice recognition to authenticate telebanking customers, adding another layer of security to its processes. Japan’s Mizuho Bank and Mitsubishi UFJ are using AI-based robots to manage their front desks and take care of routine customer queries.


Risk analytics enters its prime

Recognizing the value in fast and accurate decisions, some banks are experimenting with using risk models in other areas as well. For example, one European bank overlaid its risk models on its marketing models to obtain a risk-profitability view of each customer. The bank thereby improved the return on prospecting for new revenue sources. A few financial institutions at the leading edge are using risk analytics to fundamentally rethink their business model, expanding their portfolio and creating new ways of serving their customers. Santander UK and Scotiabank have each teamed up with Kabbage, which, using its own partnership with Celtic Bank, has enabled these banks to provide automated underwriting of small-business loans in the United Kingdom, Canada, and Mexico, using cleaner and broader data sets. Another leading bank has used its mortgage-risk model to provide a platform for real estate agents and others providing home-buying services.


Blockchain for mainstream banking

It is indeed strange to note that Bitcoin and its kind arrived to create an independent currency mechanism, free from central bank control. Today, however, central banks are keen to explore how the adoption of blockchain technology can help make the financial system more transparent, fast, efficient and secure, and track every piece of hard currency (such as the British pound and the yuan) travelling through the financial system in real time, something that is virtually impossible at the moment. Here in Malaysia, Bank Negara Malaysia (BNM) implemented a fintech regulatory sandbox which took effect in October 2016. BNM recognises the potential of new methods and technology that are emerging so rapidly, and is encouraging local banks to explore and experiment without fear. With central banks having interests in the blockchain game, we have better confidence in this matter.


Avoid these 5 IT vendor management worst practices to avoid IT audit trouble

Determining how and for what an IT vendor gets paid can provide great insight into how effectively an organization manages these groups. I’m not saying this just because I am a CPA. But because I am one, I’ve had the opportunity to perform audits of these invoices and experienced many invoice surprises. As with many business processes, some discrepancies are truly honest mistakes and misunderstandings of contract provisions. Unfortunately, not everything is. Not only should organizations recalculate the mathematical accuracy of invoices and compare the calculation to the contract, but they should also verify the source of the information (e.g., number of transactions) provided. If you can’t gain satisfaction over the integrity of your vendor’s billing process, you probably have a vendor service delivery oversight problem as well.



Quote for the day:


"Progress doesn't come from early risers; progress is made by lazy men looking for easier ways to do things." -- Robert Heinlein


Daily Tech Digest - July 02, 2017

What makes identity-driven security the new age firewall

Using a proprietary algorithm, Microsoft Advanced Threat Analytics works round the clock, continually learning the behaviour of organizational entities, such as users, devices, and resources, and helps customers adapt to the changing nature of cybersecurity attacks. In addition to this, the technology enhances threat and anomaly detection with the Microsoft Intelligent Security Graph, which is propelled by enormous amounts of datasets and machine learning in the cloud. “Identity is the new firewall. If you are taking a traditional end point/device protection approach then you are short changing your organization goals. It is critical to understand that the perimeter of IT includes users, apps across cloud and on premise, and most importantly data. Identity is what can help secure this perimeter,” says Rajiv Sodhi


Enabling IoT Ecosystems through Platform Interoperability

To enable interoperability for IoT platforms on the cloud, fog, or device level, the BIG IoT API offers a well-defined set of functionalities. Seven functionalities are crucial. The first is identity management to enable resource registration. The second is discovery of resources according to user-defined search criteria. The third is access to metadata and data (data pull as well as publish-and-subscribe for datastreams). The fourth is tasking to forward commands to things. The fifth is vocabulary management for semantic descriptions of concepts. The sixth is security management, including authentication, authorization, and key management. The seventh is charging that allows the monetization of assets through billing and payment mechanisms.
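Those seven functionalities read naturally as an interface. A hypothetical sketch follows; the method names and signatures are mine for illustration, not the actual BIG IoT API:

```python
from abc import ABC, abstractmethod

class IoTPlatformAPI(ABC):
    """Hypothetical interface mirroring the seven functionalities above.
    Names and signatures are illustrative, not the real BIG IoT API."""

    @abstractmethod
    def register(self, resource):                 # 1. identity management / registration
        ...

    @abstractmethod
    def discover(self, criteria):                 # 2. discovery by search criteria
        ...

    @abstractmethod
    def access(self, resource_id, mode="pull"):   # 3. data pull or publish-subscribe
        ...

    @abstractmethod
    def task(self, thing_id, command):            # 4. forward commands to things
        ...

    @abstractmethod
    def vocabulary(self, concept):                # 5. semantic descriptions of concepts
        ...

    @abstractmethod
    def authorize(self, principal, resource_id):  # 6. security management
        ...

    @abstractmethod
    def charge(self, asset_id, usage):            # 7. billing and payment
        ...
```

A concrete cloud, fog, or device platform would subclass this and implement each method; the abstract base simply pins down the shared contract that makes platforms interoperable.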


Artificial Neural Networks (ANN) Introduction

We recognize images and objects instantly, even if these images are presented in a form that is different from what we have seen before. We do this with the 80 billion neurons in our brain working together to transmit information. This remarkable system of neurons is also the inspiration behind a widely-used machine learning technique called Artificial Neural Networks (ANN). Some computers using this technique have even out-performed humans in recognizing images. ... An ANN model is trained by giving it examples of 10,000 handwritten digits, together with the correct digits they represent. This allows the ANN model to understand how the handwriting translates into actual digits. After the ANN model is trained, we can test how well the model performs by giving it 1,000 new handwritten digits without the correct answer.
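The train-then-test workflow described above can be shown at toy scale. A single artificial neuron (a far cry from a digit-recognition network, but the same principle: learn weights from labeled examples, then evaluate on unseen inputs):

```python
def train_neuron(data, epochs=100, lr=0.1):
    """Perceptron-style training: nudge the weights whenever the
    neuron misclassifies a labeled example."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, label in data:
            pred = 1 if x[0] * w[0] + x[1] * w[1] + b > 0 else 0
            err = label - pred
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if x[0] * w[0] + x[1] * w[1] + b > 0 else 0

# "Training set": points above the line y = x are labeled 1.
train = [((0.2, 0.8), 1), ((0.1, 0.9), 1), ((0.8, 0.2), 0), ((0.9, 0.1), 0)]
w, b = train_neuron(train)
# "Test set": unseen points, classified without being shown their labels.
print(predict(w, b, (0.3, 0.7)), predict(w, b, (0.7, 0.3)))  # 1 0
```

The digit-recognition setup in the article is the same loop at scale: many neurons, 10,000 labeled examples for training, and 1,000 held-out examples to measure how well the learned weights generalize.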


Alibaba: Building a retail ecosystem on data science, machine learning, and cloud

The war in retail has long ago gone technological. Amazon is the poster child of this transition, paving the way first by taking its business online, then embracing the cloud and offering ever more advanced services for compute and storage to third parties via Amazon Web Services (AWS). Amazon may be the undisputed leader both in terms of its market share in retail and its cloud offering, but that does not mean the competition just sits around watching. Alibaba, which some see as a Chinese counterpart of Amazon, is inspired by Amazon's success. However, its strategy both in retail and in cloud is diversified, with the two converging on one focal point: data science and machine learning (ML).


The Future is Imminent: 9 Design Trends for 2018

For those graphic designers in the audience unfamiliar with the term, synesthesia refers to the perceptual condition of mixed sensation; a stimulus in one sensory modality (like hearing) involuntarily elicits a sensation or experience in another sensory modality (like smell). A person with synesthesia might hear a bird chirping and all of a sudden smell the scent of popcorn, or taste the flavor of mint, or feel the sensation of floating. ... A progress spectrum is a far more natural way of measuring “progress.” Instead of breaking up the user experience into unnatural, linear, paginated steps, a progress spectrum reflects the true experience of the user, one in which progress is experienced along a broad and continuous spectrum, where one event seamlessly flows into the next.


The Cyber-frauds

A mobile wallet works like an electronic prepaid card and can be used to pay for things ranging from grocery to rail tickets without the need to swipe the debit/credit card. All you have to do is to key in the username and password for logging in. The app can be loaded with money either through debit/credit card or net banking. The flip side is that these wallets mostly rely on the phone's locking system for security and don't ask for any PIN or password while the payment is being made. ... Rahul Gochhwal, co-founder of Trupay, says, "The biggest security issue is lack of second factor of authentication (password) while transacting. This makes them vulnerable to system-level breaches as transactions can be system generated by a hacker without a password. Thus, technically, a hacker can make thousands of fraudulent transactions simultaneously."
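The "second factor of authentication" mentioned in the quote is typically a one-time password. The standard HOTP construction behind many such codes (RFC 4226) fits in a few lines:

```python
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """HMAC-based one-time password (RFC 4226): HMAC-SHA1 over a moving
    counter, dynamically truncated to a short decimal code."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 4226 test vector: secret "12345678901234567890", counter 0
print(hotp(b"12345678901234567890", 0))  # 755224
```

A wallet could verify such a code server-side against the same secret and counter (or a time step, in the TOTP variant), so a hacker with system-level access but no secret could not generate valid transactions in bulk.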


What every CIO Needs to Know About Cyber Resilience

Bohmayr & Türk, from the Boston Consulting Group, write that “cyber-resilience in an organization must extend beyond the technical IT domain to the domains of people, culture and processes. A company’s protective strategies and practices should apply to everything the company does — to every process on every level, across departments, units and borders, in order to foster an appropriately security-conscious culture.” ... The issue of board responsibility and oversight of cyber risks isn’t new. In 2015, the Cybersecurity Disclosure Act of 2015 bill was introduced in the US Senate. The bill would have required “public companies to disclose whether any board member has experience or expertise in cybersecurity, and to describe the nature of that background” and should no board director have cybersecurity expertise, to justify why such expertise was unnecessary.


Security in a silo – breaking down the barrier between CISOs & C-Suite

If you’ve been in the security industry for any length of time, and as a CISO I assume you have been, you’ve probably already seen and read such articles… but if you are still reading this, it probably means that you don’t feel like this is yet a reality within your organisation. So, with the shared understanding that we are both in agreement that this shift is past due, we can start to talk about building your strategy to make it happen. Before doing so, however, we need to acknowledge a sobering truth: People don’t care about security for the sake of security alone. What they care about is the result that a sound security strategy can provide and the impacts/risks associated with the lack of a sound security strategy. We’ll use this understanding to inform the methods that we use to engage the organisation and our board.


The Hard-Dollar Benefits of GRC Consolidation

The intention of implementing a single platform architecture must come from an IT vision for rationalizing applications in use across the company. Instead of supporting potentially hundreds of applications that each do only one thing, the GRC infrastructure, in time, should comprise one cohesive platform that supports many functions. GRC applications tend to span activities and departments within organizations. Groups across the company often independently manage activities such as risk assessments, audits, controls testing and third-party assessments. To do this, they make use of many individual solutions – some of which do the same thing, just in different organizational silos. For secure business management, this colossal set of single-use applications has to be visible, managed, supported and maintained. This is something that is both costly and time inefficient.


The Computest Story: The Transformation to an Agile Enterprise

Inspired by Henrik Kniberg & Anders Ivarsson's famous article on how Spotify scaled their development organization we decided to put multidisciplinary teams in the center, supported by a group of people outside the teams focusing on coaching and fulfilling company-wide responsibilities. As Figure 2 indicates, the major difference in the first transformation step was to integrate as many central functions in interdisciplinary teams as possible, to structure them by industries and to differentiate the leadership group into 'captains' and 'coaches'. Whereas the captains took over social leadership for the teams as well as responsibility for resource and account management, the coaches formed a group of thought leaders with a broad variety of subject matter expertise, responsible for both policies and solutions.



Quote for the day:


"Thinking is the hardest work there is, which is probably the reason so few engage in it." -- Henry Ford


Daily Tech Digest - July 01, 2017

Windows 10 to Get Built-in Protection Against Most Ransomware Attacks

In the wake of recent devastating global ransomware outbreaks, Microsoft has finally realized that its Windows operating system is dangerously vulnerable to ransomware and other emerging threats that specifically target its platform. To tackle this serious issue, the tech giant has introduced a new anti-ransomware feature in its latest Windows 10 Insider Preview Build (16232) yesterday evening, along with several other security features.... The anti-ransomware feature, dubbed Controlled Folder Access, is part of Windows Defender. It blocks unauthorized applications from making any modifications to your important files located in certain "protected" folders; only applications on a whitelist can access those folders.
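Microsoft hasn't published the internals, but the whitelist behaviour described can be modelled simply: a write into a protected folder is denied unless the writing application is on the allow list. The paths and application names below are illustrative:

```python
from pathlib import PurePosixPath

# Illustrative protected folder and whitelist; Windows Defender manages
# the real lists internally (and uses Windows paths).
PROTECTED = [PurePosixPath("/home/user/Documents")]
WHITELIST = {"word.exe", "excel.exe"}

def allow_write(app_name: str, target: str) -> bool:
    """Deny modifications inside protected folders unless the app is whitelisted."""
    path = PurePosixPath(target)
    in_protected = any(f == path or f in path.parents for f in PROTECTED)
    return app_name in WHITELIST or not in_protected

print(allow_write("ransom.exe", "/home/user/Documents/report.docx"))  # False
print(allow_write("word.exe",   "/home/user/Documents/report.docx"))  # True
```

The effect is that a ransomware process can still run, but its attempts to encrypt files in protected folders are blocked, which is exactly the containment the feature aims for.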


How IoT is empowering the elderly to become healthier and more productive

The central feature is the use of a personal dashboard for the patient, with a personalized treatment plan. The steps are then sent to smart devices to notify patients when they need to inject insulin, for instance. "It provides the healthcare team, not only the raw data, like the glucose reading or the insulin activity level," he said. "It provides everything that we collect from the patient. It actually provides them with all kinds analytic insights, diagnostics, and treatment plans." It's a "patient-involved system," he said, that can crunch numbers based on the AI engine to deliver specific recommendations. It will say "the intervention will be to take those medications at that time, at that dosage, and that they should follow up in one week, two weeks, etc.," he said. Still, the physician remains the ultimate gatekeeper, and can override the system if necessary.


Master data management driving better business decisions

While there is still a lot of excitement about new technologies such as cloud and graph databases replacing so-called “legacy” technologies, in reality the boring but strategic business processes of onboarding customers and suppliers, and transacting with your customers and suppliers, remain fundamental business requirements that cannot be ignored. And these business fundamentals are greatly enhanced by the quality data that MDM brings to the equation. It seems as if the very large scale ($100 million and up) MDM programs are not as widespread as they were three to five years ago, which is causing some stress on the software vendors and the systems integrators. Many of the MDM programs we see are increasingly tactical rather than enterprise in nature.


Under pressure: Is it now make or break for net neutrality?

“We’ll see more pressure on regulators to adopt the US measures, as operators and content providers lobby for less regulation,” says Martin Morgan, VP marketing at Openet. “Operators have paid huge sums of money for spectrum and invested in rolling out 4G networks. With data becoming commoditised they’ll be looking for more innovative ways to deliver content services.” To do that they will of course have to keep a close eye on what’s going on in the US. Michael Hekimian, a legal director at law firm Ashfords says that the US will now become the “acid test” for new business models and in particular any alternate pricing models. If ISPs and content providers manage to improve services to consumers without raising prices and being anti-competitive then expect to see pressure on global regulators mount.


Active Cyber Defense Will Help Deter Future Threats

Retaliation is a response to a cyberattack that could manifest any number of ways. Responses include a mix of sanctions, cyber responses like a direct “hackback” on the offender, and even a conventional kinetic attack in extreme cases. ... Denial is a form of active cyber defense in which an entity has such formidable cyber defenses that it removes the incentive of carrying out an attack, thus leaving little motive to carry out any further attacks. Denial processes include a cyber kill chain, where a company receives notification of an attack at multiple stages and is thus able to stop it. ... The third form of deterrence is entanglement, or norms created to regulate cyber behavior. Entanglement is a necessity for those looking to prevent cyberattacks by state actors, as it introduces accountability into their decision-making calculus.


Biometrics: Moving Far Beyond Fingerprints

Organizations have struggled for decades to find security tools that kept out bad guys while admitting authorized persons. This is both a physical security and cybersecurity issue. But, Dunkelberger adds, thanks to the impact of biometrics over the past few years, security is no longer quite as difficult. “Every day,” he says, “millions of people interact with a sliver of glass in their pocket that will tell them everything from the current age of the universe to when their shampoo will be delivered to their doorstep to how much money they have in their retirement account. Each of these interactions, thanks to biometrics, can be accomplished seamlessly and without friction. No longer are they required to create and remember a highly entropic code to use as a shared-secret; now they can simply look at that sliver of glass and blink.” Biometrics are changing the way we think about security.


The latest cyberattack is more than it seems

It transpired soon that the malware's developers didn't really want the money. There was a single email address specified for contact with the hackers, but it was soon blocked by the service provider, as usually happens in these cases. Besides, it turned out that the virus encrypted the victims' hard disks without the possibility of recovery. That's odd: An attacker who wanted money would have taken care he could receive it; or at least would have demonstrated his ability to decrypt the files. So cui bono; who benefits from this? Ukrainian officials were quick to accuse Russia of waging cyberwarfare against their country -- but that's almost white noise these days, coming from Kiev, and many observers were confused by the malware's seeming geographic indifference. It hit large Russian companies, too -- the state-oil giant Rosneft and the giant steelmaker Evraz, among others.


Advertisers are closer to knowing exactly where you are

Chris Clarke is chief creative officer at International DigitasLBi and he has strong words about what is an increasingly important part of his agency’s business. “The whole industry is talking data, and yet there remains a huge gulf between promise and proof,” he says. “Basic accuracy has been a huge issue with geolocation and elsewhere there's the issue of insight. The smartest operators are bringing multiple data sources together and looking for anomalies that lead to creative insight. Get this right and the outcome is relevant, useful and charming. Get it wrong and it's spooky, or just wrong.” Another interesting London company in the space is LoopMe, a mobile video platform that is driven by AI, employing algorithms that optimise ad placements in real time. It claims it can reach three billion consumers worldwide. LoopMe recently launched PurchaseLoop Foot Traffic, which uses AI to deliver video advertising at the moment customers are most likely to head to a store.


Machine learning is transforming lending

The front-end provides APIs for connectivity to the banks' own operational processes. This is where CapitaWorld's operational efficiency model also claims strengths. The fully digital form with inbuilt validated information creates efficiency through reduction in human-resource intensive processes. The queue time reduces from weeks to hours.  ... And finally the credit decision process itself. The model is based on machine learning. Prior decisions and rules as well as portfolio performance are captured by the platform. The vastly superior computing power available today enables the building and analysis of multiple hypotheses. This in turn sets up new decision outcomes. It also means that pricing and risk decisions can be taken on much smaller sets of customers, even at an individual level. It is a step away from a standard annual percentage rate (APR) model. Imagine if your credit card interest rate was specific to you, based on your past behaviour.
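Pricing "specific to you" could, in the simplest possible sketch, come from a risk score over a borrower's features. The weights, features, and rates below are entirely illustrative, not real lending data or CapitaWorld's model:

```python
import math

# Illustrative weights a trained model might produce; not real lending data.
WEIGHTS = {"late_payments": 0.8, "utilization": 1.5, "years_history": -0.3}
BIAS = -2.0
BASE_RATE, RISK_SPREAD = 0.12, 0.18   # 12% floor plus up to 18% risk premium

def individual_rate(borrower: dict) -> float:
    """Logistic risk score over the borrower's features, mapped to a rate
    priced to that individual's estimated default probability."""
    z = BIAS + sum(WEIGHTS[k] * borrower[k] for k in WEIGHTS)
    p_default = 1 / (1 + math.exp(-z))
    return BASE_RATE + RISK_SPREAD * p_default

prompt_payer = {"late_payments": 0, "utilization": 0.2, "years_history": 10}
risky_payer  = {"late_payments": 4, "utilization": 0.9, "years_history": 1}
print(f"{individual_rate(prompt_payer):.1%} vs {individual_rate(risky_payer):.1%}")
```

In a real system the weights would be learned from prior decisions and portfolio performance, and retrained as outcomes accumulate; the point is that the rate becomes a function of the individual rather than a bracket.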


Take Care of Yourself: CISO Self Care During Wartime

The challenge for Security leaders is that most organizations don’t really know what they want from their CISOs. During times of peace they want a diplomat — someone who can sit in the C-suite and talk about business objectives in non-technical terms. But when EternalBlue comes calling, they want a Commander-In-Chief/General/Drill Sergeant/Grunt to just make it all go away. The result is a CISO who has to bungee between the front lines and the corner office in the space of an hour. And make it look like you have complete control, because, you know, Leadership. ... Seriously, anyone in Security, and particularly the Security leader, needs a significant support structure and coping mechanisms that go beyond “take care of yourself” if they’re going to survive in the role. Surround yourself with colleagues who can not only sympathize, but can help you find a way to emerge from a crisis with your sanity intact.



Quote for the day:


"Don't raise your voice, improve your argument." -- Desmond Tutu


Daily Tech Digest - June 30, 2017

What is Docker? Linux containers explained

Containers decouple applications from operating systems, which means that users can have a clean and minimal Linux operating system and run everything else in one or more isolated containers. Also, because the operating system is abstracted away from containers, you can move a container across any Linux server that supports the container runtime environment. Docker, which started as a project to build single-application LXC containers, introduced several significant changes to LXC that make containers more portable and flexible to use. Using Docker containers, you can deploy, replicate, move, and back up a workload even more quickly and easily than you can using virtual machines. Basically, Docker brings cloudlike flexibility to any infrastructure capable of running containers. Thus, Docker is often credited for the surging popularity of modern-day containers.


Teardown of 'NotPetya' Malware: Here's What We Know

The malware can spread by using two attack tools built by the "Equation Group" - likely the National Security Agency - and leaked by the Shadow Brokers. The tools generate packets that attempt to exploit an SMB flaw in prior versions of Windows. "The new ransomware can spread using an exploit for the Server Message Block (SMB) vulnerability CVE-2017-0144 (also known as EternalBlue), which was fixed in security update MS17-010 and was also exploited by WannaCrypt to spread to out-of-date machines," Microsoft says. "In addition, this ransomware also uses a second exploit for CVE-2017-0145 (also known as EternalRomance, and fixed by the same bulletin)." "Machines that are patched against these exploits (with security update MS17-010) or have disabled SMBv1 are not affected by this particular spreading mechanism."


Eight obstacles to overcome in your digital transformation journey

"Digital transformation involves a significant change, and usually changes to people's jobs, compensation, bosses, and the type of work they do," said Marc Cecere, vice president and principal analyst on Forrester's CIO role team. "Making that kind of change is difficult, and is something where there is not a lot of science. Make sure you have someone on board who knows how people's minds are changed, and how to adapt to new business models." Organizational silos are one of the biggest impediments to digital transformation efforts, in terms of understanding the customer journey, said Gianni Giacomelli, chief innovation officer at Genpact, and head of its Genpact Research Institute. Often either the IT group or the business lines tries to solve it alone, rather than working together deeply, Giacomelli said.


Critical Infrastructure Protection: Security problems exist despite compliance

The electronic security perimeter (ESP) encompasses the control systems, server room, telecom room and so on. The critical cyber-assets will fall under this section of CIP. For the most part, entities covered by CIP will spend a good deal of time and energy constructing a hard exterior (the ESP), but the assets contained within – the guts – are soft. "We're talking fairytale darkness here, all of the stuff you see on television when the power grid goes down, that's going to happen when the ESP is successfully breached," Grimes said. You would think that the ESP would be the ultimate hard point, but it isn't in most cases. Physical access controls (PACs) are not covered under the ESP section. For example, video cameras are a weak point, as they're not considered when it comes to the ESP.


Global shipping feels fallout from Maersk cyber attack

The impact of the attack on the company has reverberated across the industry given its position as the world's biggest container shipping line and also operator of 76 ports via its APM Terminals division. Container ships transport much of the world's consumer goods and food, while dry bulk ships haul commodities including coal and grain and tankers carry vital oil and gas supplies. "As Maersk is about 18 percent of all container trade, can you imagine the panic this must be causing in the logistic chain of all those cargo owners all over the world?" said Khalid Hashim, managing director of Precious Shipping (PSL.BK), one of Thailand's largest dry cargo ship owners. "Right now none of them know where any of their cargoes (or) containers are. And this 'black hole' of lack of knowledge will continue till Maersk are able to bring back their systems on line."


How to write event-driven IoT microservices that don’t break

One concept that jumped out at me was the notion of a “heisenbug,” which the article defines as “timing-related bugs that often disappear during an investigation of it.” The term “heisenbug” stems from the analogy of physics’ Heisenberg Uncertainty Principle, under which the attempt to observe a system inevitably alters its state. Where computing environments are concerned, heisenbugs are equivalent to probe effects, in which attaching a test probe—or simply sending an asynchronous test ping—to a system changes its behavior. What that implies is that the very act of trying to isolate, analyze, and debug some systemic glitches will alter the underlying systemic behavior of interest—perhaps causing the bugs in question not to recur. One of the chief causes of heisenbugs is race conditions, under which a system behaves erratically when asynchronous input events don’t take place in the specific order expected by that system’s controlling program.
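The race-condition idea above can be sketched in a few lines of Python (a hypothetical illustration, not code from the article). Two threads increment a shared counter; without a lock, the read-modify-write can interleave and updates may be lost. Notably, inserting an observation probe (a print or breakpoint) inside the loop changes the thread timing and often makes the lost updates disappear, which is exactly the heisenbug/probe effect described:

```python
import threading

def demo(iterations=100_000, use_lock=False):
    """Increment a shared counter from two threads.

    Without the lock, `counter += 1` is a non-atomic
    load/add/store sequence, so concurrent updates can be lost.
    """
    counter = 0
    lock = threading.Lock()

    def bump():
        nonlocal counter
        for _ in range(iterations):
            if use_lock:
                with lock:
                    counter += 1
            else:
                counter += 1  # race: not atomic across threads

    threads = [threading.Thread(target=bump) for _ in range(2)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return counter

# With the lock the result is deterministic: 2 * iterations.
print(demo(use_lock=True))    # 200000
# Without the lock the result may come up short -- and adding a
# print() inside bump() to "watch" the race alters the timing,
# often hiding the bug: a heisenbug in miniature.
print(demo(use_lock=False))
```

The unlocked result is intentionally not asserted anywhere: whether updates are actually lost depends on interpreter version and thread-switch timing, which is the whole point.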


Blockchain remains a work in progress for use in healthcare

Blockchain has inherent qualities that provide trust and security, but it is not a technological panacea for all that ails healthcare when it comes to cybersecurity, believes Debbie Bucci, an IT architect in ONC’s Office of Standards and Technology. “When I look across other industries, I don’t see any of them really aggressively adopting it,” says Bucci, whose primary focus is on the privacy and security aspects of health information exchange. “There’s a lot of proof of concepts, pilots and use cases being defined. But, I have yet to see major companies stepping up to support blockchain—beyond Bitcoin, of course.” According to Bucci, ONC continues to keep a close watch on what develops in the marketplace when it comes to blockchain, which is still evolving and maturing, especially with respect to its applicability to healthcare.


The 360 degree approach to cyber security

In order to take the right security measures, you need to understand where to direct your attention. A good start is to assess who the potential adversaries are, and what damages a security compromise can cause – a risk analysis if you will. Getting a full view of the attack surface is an integral part of this, but it’s not easy. Many companies don’t even know their digital footprint, leaving them unaware of potential entry points for attackers and threats. Plus, the IT systems in many companies have grown organically, resulting in intertwined systems, outsourced infrastructure, and third parties that are digitally connected and integrated with business processes. Keeping all of this under rigid control is virtually impossible. And while there are technical solutions that provide the visibility you need, just mapping your digital footprint isn’t enough.


Five DevOps principles to apply to enterprise architecture

Because DevOps breaks down barriers that traditionally separate various teams within an IT organization, individual roles need to be malleable. For example, someone whose job title is "developer" should have the organizational flexibility to participate in IT operations work when needed. DevOps is about cultural practices, not specific technologies or tools. Still, DevOps works best when the IT team has modern, agile tool sets and frameworks at its disposal. For example, migrating from virtual machines to containers can help your organization manage DevOps more effectively. When designing your enterprise architecture, controlling access to sensitive information about the infrastructure or the data stored on it is important. But this need should be balanced against the importance of maximizing visibility across the organization.


Medical Device Cybersecurity: A Long Way to Go

In a statement provided to ISMG, the FDA says it generally does not comment on specific studies, "but evaluates them as part of the body of evidence to further our understanding about a particular issue and assist in our mission to protect public health. The FDA is carefully reviewing the findings of the report. The FDA takes medical device cybersecurity seriously, and we look forward to engaging directly with the sponsor of the report so we can have a better understanding of the report's data, methodologies of information collection and conclusions." The FDA also notes: "Medical device manufacturers must comply with federal regulations. Part of those regulations, called quality system regulations, require that medical device manufacturers address all risks, including cybersecurity risk."



Quote for the day:


"Do not be concerned that no one recognizes your merits; be concerned that you may not recognize others." -- Confucius