Daily Tech Digest - July 08, 2017

Robot Says "Culture" - Moving towards Teal

Teal is still just a colour. It is one of a set of colours that represent the predominant state of consciousness of an organization. Influenced by Ken Wilber's integral theory and Spiral Dynamics, Frederic Laloux helped us reach a deeper understanding of the Teal state of consciousness in his book "Reinventing Organizations". Laloux shows how this state of consciousness has evolved over time, with the breakthroughs achieved at each stage enabling ways of working that were not possible in the previous paradigms (colours). Even though the colours appeared over the course of time, this does not mean that one colour replaces another when a new shift occurs. Often, organizations get stuck in one predominant style of thinking because it is deeply embedded in their culture, and this is where most organizations find themselves today: stuck.


Anti-Virus Conspiracy Theories Cut Both Ways

In case a theme isn't clear here, it's that the Russian government isn't the only cybersecurity threat in the world. But that doesn't mean any given country's cybersecurity firms are a threat. Back in 2012, for example, Schneier said it was highly unlikely that a government would attempt to compel any domestic cybersecurity firms to whitelist malware, simply because related knowledge would be so difficult to contain. "My guess is that the NSA has not done this, nor has any other government intelligence or law enforcement agency," Schneier said. "My reasoning is that anti-virus is a very international industry, and while a government might get its own companies to play along, it would not be able to influence international companies." Mikko Hypponen, chief research officer at F-Secure, also in 2012 said anti-virus firms collectively "want to detect malware, regardless of its source or purpose," and that "politics don't even enter the discussion, nor should they."


7 Layers of Security Each Business Owner Should Consider

Employees who go off to lunch with their computers on and exposed are inviting hacking, especially if those computers are in more "open" spaces of a business, such as a floor full of cubicles. Users who don't take simple steps like locking their computers when away from their desks create an easy opening for their information to be stolen. It only takes a few seconds for someone to plug in a memory card and steal your personal information. Another issue is the disposal of old computers. Sanitizing and wiping procedures for old hard drives are at times not sufficient and can allow hackers to retrieve information from those drives. There are a number of tools available to securely erase hard drives, or you can have it done professionally. Physical security is one of the most overlooked aspects of security; even if you cannot fully guarantee that your hardware is physically secure, there are steps you can take to improve it.
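The secure-erase advice above can be sketched in code. This is only an illustration: it overwrites a throwaway file (the name `secret.tmp` is invented for the demo) with random bytes before unlinking it. On SSDs with wear-leveling and on journaling filesystems, file-level overwrites do not reliably destroy the old blocks, which is exactly why the dedicated tools and professional services mentioned above exist.

```python
import os
import secrets

def overwrite_and_delete(path: str, passes: int = 3) -> None:
    """Overwrite a file with random bytes before deleting it.

    Caveat: on SSDs and journaling filesystems this does NOT guarantee
    the old blocks are unrecoverable; use a dedicated wiping tool for
    whole-drive sanitization.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(secrets.token_bytes(size))
            f.flush()
            os.fsync(f.fileno())  # push the overwrite to disk
    os.remove(path)

# Demo on a throwaway file.
with open("secret.tmp", "wb") as f:
    f.write(b"account numbers and passwords")
overwrite_and_delete("secret.tmp")
print(os.path.exists("secret.tmp"))  # False
```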


Best VPN services of 2017: Reviews and buying advice

In truth, it’s hard to select the best overall VPN. Some services are weaker on privacy, but are significantly easier to use, while others could stand an interface redesign. Nevertheless, the point of a VPN is to remain private and to have your internet activity kept as private as possible. For that reason, we’re choosing Mullvad as the best overall VPN. The interface needs a lot of work, but the company does a great job at privacy. Mullvad doesn’t ask for your email address, and you can mail your payment in cash if you want to. Like many other VPNs, Mullvad has a no-logging policy and doesn’t even collect any identifying metadata from your usage. Mullvad is also fast, even if it’s not the fastest VPN we’ve tested. Add a more user-friendly interface and Mullvad would be nearly unbeatable.


All you need to know about the move from SHA-1 to SHA-2

SHA-2 is the cryptographic hashing standard that all software and hardware should be using now, at least for the next few years. SHA-2 is often called the SHA-2 family of hashes because it contains many different-size hashes, including 224-, 256-, 384-, and 512-bit digests. When someone says they are using the SHA-2 hash, you don't know which bit length they are using, but the most popular one is 256 bits (by a large margin). Although SHA-2 shares some of the same math characteristics as SHA-1 and minor weaknesses have been discovered, in crypto-speak it's still considered "strong" for the foreseeable future. Without question, it's way better than SHA-1, and any critical certificates, applications, and hardware devices still using SHA-1 should be moved to SHA-2.
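The family of digest sizes described above is easy to see with Python's standard `hashlib` module (a quick illustration, not tied to any particular certificate tooling):

```python
import hashlib

msg = b"moving from SHA-1 to SHA-2"

# SHA-2 is a family: the same construction at several digest sizes.
for algo in ("sha224", "sha256", "sha384", "sha512"):
    h = hashlib.new(algo, msg)
    print(f"{algo}: {h.digest_size * 8} bits")

# SHA-1 is still available in the library, but it should no longer be
# used for certificates or signatures.
print(f"sha1: {hashlib.sha1(msg).digest_size * 8} bits")
```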


Analyzing the Anthem Breach Class Action Settlement

"What will be interesting to see will be the kinds of claims that will be made against that [Anthem settlement] fund," Teppler says. "In the end you have [nearly] 80 million people at risk for ... identity theft," including medical identity theft, which can have long-lasting ramifications. For example, he points out, if fraudsters make claims for health insurance coverage using stolen identities, those could impair individuals' ability to obtain life insurance because of false medical information being added to their records. While most of the proposed provisions of the Anthem settlement are common in other data breach class action settlements, "one of the things a bit novel [in the Anthem deal] is repayment of credit monitoring for already expended funds for victims," Teppler says.


Luna brings visual development to functional programming

Luna’s creators argue that because developers typically start sketching components and dependencies on a whiteboard before coding, it doesn’t make sense to then implement that logic only in text. Software can have thousands of lines of code distributed in hundreds of files, which can trip up the implementation of that visual data flow and application architecture. Tools such as UML architecture diagrams only deal with the symptoms and not the problem’s source, Luna’s creators argue. That’s why Luna features both visual and textual representations. Developers can maintain their coding habits while also having a graphical whiteboard-like interface. Luna’s visual representations reveal structure, behavior, and data flow. It allows prototyping and visual profiling to understand performance bottlenecks.


The next logical step in Google's Android-Chrome OS 'merger'

Google has little by little been bringing elements of Android into the Chromebook world. It started with a very limited and experimental attempt at making some Android apps compatible with Chrome OS devices. Then came the gradual visual alignment, with Android-like fonts and design making their way into one Chrome OS system app after another and eventually seeping into the platform's core interface. Factor in features such as the Android-Chrome connecting Smart Lock, the Android-inspired PIN Unlock, and the availability of the full Play Store on Chromebook devices—not to mention the Chrome-OS-inspired "seamless" update model that came to Android with last year's 7.0 release—and it's easy to see how all these little pieces are adding up to something big.


Intel SSD 545s review: The next great budget SSD has arrived

At $180 for 512GB, the 545s offers all the capacity and cost advantages of TLC NAND plus the sustained write speed of MLC NAND. Hands-on, the 545s was the smoothest, most consistent performer we’ve seen in a while. The OS popped, all types of file operations were silky, and there were no stutters when opening apps. The Samsung EVO is also smooth and consistent, but it drops to around 300MBps writing when it runs out of TLC-as-MLC cache. The EVO, unlike the 545s, does have its RAPID caching software, which can significantly boost everyday performance by using system RAM as cache. We’re not fans of memory caching as it increases the risk of data loss due to power failure, so we’re only mentioning that to ward off comments.


Rethinking what it means to win in security

Consider how it works in retail. A reality of selling goods is "shrinkage." That's a fancy way of saying that goods get lost and stolen. The mindset requires understanding that a situation in which nothing is lost, broken, or stolen is simply unrealistic, which forces us to consider what a reasonable amount of loss is. ... The key lesson here is that while what is considered reasonable has changed over time, it is not zero. Embedded in this shift is the economics of improvement: each reduction in shrinkage needs to cost less to obtain than the savings it generates. After all, it doesn't make sense to spend $100 to protect $1. Coupling the cost of improvement with a measured reduction in overall impact to the business is a reasonable way to understand success.
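The cost-benefit rule described above (never spend $100 to protect $1) can be sketched as a simple annual-loss-expectancy comparison. The figures below are invented for illustration:

```python
def annual_loss(incidents_per_year: float, loss_per_incident: float) -> float:
    """Annual loss expectancy: expected incidents times cost per incident."""
    return incidents_per_year * loss_per_incident

def control_is_justified(cost: float, ale_before: float, ale_after: float) -> bool:
    # The reduction in expected loss must exceed what the control costs.
    return (ale_before - ale_after) > cost

ale_before = annual_loss(12, 2_000)  # $24,000/yr of "shrinkage"
ale_after = annual_loss(4, 2_000)    # a control cuts incidents to 4/yr

print(control_is_justified(10_000, ale_before, ale_after))  # True: $16k saved for $10k spent
print(control_is_justified(100, 1, 0))                      # False: $100 to protect $1
```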



Quote for the day:


"What lies behind us & what lies before us are tiny matters compared to what lies within us." -- Ralph Waldo Emerson


Daily Tech Digest - July 07, 2017

Algorithmic Decomposition Versus Object-Oriented Decomposition

The principal advantage of object-oriented decomposition is that it encourages the reuse of software. This results in flexible software systems that can evolve as system requirements change. It allows a programmer to use object-oriented programming languages effectively. Object-oriented decomposition is also more intuitive than algorithm-oriented decomposition because objects naturally model entities in the application domain. Object-oriented design is a design strategy where system designers think in terms of ‘things’ instead of operations or functions. The executing system is made up of interacting objects that maintain their own local state and provide operations on that state information. They hide information about the representation of the state and hence limit access to it. An object-oriented design process involves ...
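The decomposition style described above can be illustrated with a short Python sketch: an object that models a domain entity, keeps its representation private, and exposes operations on that state. The `BankAccount` class and its rules are invented for the example:

```python
class BankAccount:
    """An object models a domain 'thing': it owns its state and the
    operations on that state, hiding the representation from callers."""

    def __init__(self, owner: str, balance: float = 0.0):
        self._owner = owner
        self._balance = balance  # leading underscore: internal representation

    def deposit(self, amount: float) -> None:
        if amount <= 0:
            raise ValueError("deposit must be positive")
        self._balance += amount

    def withdraw(self, amount: float) -> None:
        if amount > self._balance:
            raise ValueError("insufficient funds")
        self._balance -= amount

    @property
    def balance(self) -> float:
        # Callers see a value, not the stored representation.
        return self._balance

acct = BankAccount("Ada")
acct.deposit(100.0)
acct.withdraw(30.0)
print(acct.balance)  # 70.0
```

Because callers depend only on the operations, the internal representation can change (say, to integer cents) without touching any client code, which is the reuse and flexibility the excerpt describes.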


Virtual Panel: High Performance Application in .NET

Microsoft has made quite a few investments in platform performance. To cite some examples: .NET Native was introduced a few years ago to improve startup times and reduce memory usage for client apps; .NET 4.5 and 4.6 saw important improvements to the scalability of the garbage collector; .NET 4.6 had a revamped JIT compiler with support for vectorization; C# 7 introduced ref locals and ref returns, which are features designed to allow for better performance on the language level. All in all, it would probably be easier for me personally to write a small high-performance application in a lower-level language like C or C++. However, introducing unmanaged code into an existing system, or developing a large codebase with these languages, is not a decision to be taken lightly. It makes sense to bet on .NET for certain kinds of high-performance systems, as long as you are aware of the challenges and have the tools to solve them on the code level.


Medical devices at risk: 5 capabilities that invite danger

Chris Camejo, director of product management, threat intelligence at NTT Security, noted that most medical devices in use today would be secure "only in a closed, trusted environment without any potentially malicious activity." "Unfortunately a hospital network can't be considered trusted, as it is connected to the internet and contains thousands of internal users, any one of whom could click on the wrong link or download the wrong attachment," he said. Still, debate continues about how imminent the risk of physical harm is. Jay Radcliffe, a medical device security expert and Type 1 diabetic, famously said at the 2014 Black Hat conference that it would be far more likely for "an attacker to sneak up behind me and deliver a fatal blow to my head with a baseball bat" than for him to be harmed by a cyber attack.


Artificial Stupidity: Learning To Trust Artificial Intelligence (Sometimes)

While deep learning AI can surprise its human users with flashes of brilliance — or stupidity — deterministic software always produces the same output from a given input. “Machine learning cannot be verified and certified,” Cherepinsky said. “Some algorithms we chose not to use… even though they work on the surface, they’re not certifiable, verifiable, and testable.” Sikorsky has used some deep learning algorithms in its flying laboratory, Cherepinsky said, and he’s far from giving up on the technology, but he doesn’t think it’s ready for real world use: “The current state of the art (is) they’re not explainable yet.” ... “You see in artificial intelligence an increasing trend towards lifelike agents and a demand for those agents, like Siri, Cortana, and Alexa, to be more emotionally responsive, to be more nuanced in ways that are human-like,” David Hanson, CEO of Hong Kong-based Hanson Robotics, told the Johns Hopkins conference.


Experts Predict When Artificial Intelligence Will Exceed Human Performance

The experts predict that AI will outperform humans in the next 10 years in tasks such as translating languages (by 2024), writing high school essays (by 2026), and driving trucks (by 2027). But many other tasks will take much longer for machines to master. AI won’t be better than humans at working in retail until 2031, able to write a bestselling book until 2049, or capable of working as a surgeon until 2053. The experts are far from infallible. They predicted that AI would be better than humans at Go by about 2027. (This was in 2015, remember.) In fact, Google’s DeepMind subsidiary has already developed an artificial intelligence capable of beating the best humans. That took two years rather than 12. It’s easy to think that this gives the lie to these predictions.


Major cyber incidents accelerating, says NCSC

“This increase in major attacks is mainly being driven by the fact that cyber attack tools are becoming more readily available, in combination with a growing willingness to use them,” he told The Cyber Security Summit in London. Although the WannaCry ransomware attacks in May 2017 came very close, Noble said there had been no C1-level national cyber security incidents to date. The majority of the major incidents the NCSC has dealt with were C3-level attacks, typically confined to single organisations. These account for 451 incidents to date. The remaining 29 major incidents were C2-level attacks, significant attacks that typically require a cross-government response. Across these nearly 500 incidents, Noble said there were five common themes or lessons to be learned.


Learning To Wear Your Intelligence: How To Apply AI In Wearables and IoT

Kurzweil claims that machines will pass the Turing test by 2029, and that around 2045, “the pace of change will be so astonishingly quick that we won’t be able to keep up, unless we enhance our own intelligence by merging with the intelligent machines we are creating”. He further claims that humans will be a hybrid of biological and non-biological intelligence that becomes increasingly dominated by its non-biological component. Kurzweil envisions nanobots inside our bodies that fight against infections and cancer, replace organs, and improve memory and cognitive abilities. ... The artificial general intelligence (AGI), or strong AI, community, though varying widely on the timeframe for reaching the singularity, is in consensus that it is plausible, while most mainstream AI researchers doubt that progress will be that rapid.


A Tour of Recurrent Neural Network Algorithms for Deep Learning

Recurrent neural networks, or RNNs, are a type of artificial neural network that adds weights to the network to create cycles in the network graph, in an effort to maintain an internal state. The promise of adding state to neural networks is that they will be able to explicitly learn and exploit context in sequence prediction problems, such as problems with an order or temporal component. After reading this post, you will know: how top recurrent neural networks used for deep learning work, such as LSTMs, GRUs, and NTMs; how top RNNs relate to the broader study of recurrence in artificial neural networks; and how research in RNNs has led to state-of-the-art performance on a range of challenging problems.
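The recurrence described above can be sketched in a few lines of NumPy. This is a vanilla RNN cell, not an LSTM, GRU, or NTM, and the weights are random rather than trained; the point is only to show how the hidden state carries context from step to step:

```python
import numpy as np

rng = np.random.default_rng(0)

# A vanilla RNN cell: hidden state h is fed back in at every step,
# which is the "cycle in the network graph" that maintains state.
W_xh = rng.normal(scale=0.1, size=(4, 3))  # input -> hidden
W_hh = rng.normal(scale=0.1, size=(4, 4))  # hidden -> hidden (the recurrence)
b_h = np.zeros(4)

def rnn_step(x, h):
    return np.tanh(W_xh @ x + W_hh @ h + b_h)

h = np.zeros(4)
sequence = [rng.normal(size=3) for _ in range(5)]
for x in sequence:
    h = rnn_step(x, h)  # h now summarizes everything seen so far

print(h.shape)  # (4,)
```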


Machine Learning, Artificial Intelligence - And The Future Of Accounting

When accounting software companies eliminated desktop support in favor of cloud-based services, accounting firms were forced to adapt to life in the cloud. Similarly, accounting departments and firms will be forced to adopt machine learning to remain competitive since machines can deliver real-time insights, enhance decision making and catapult efficiency. Rather than eliminate the human workforce in accounting firms, the humans will have new colleagues—machines—who will pair with them to provide more efficient and effective services to clients. Currently, there is no machine replacement for the emotional intelligence requirements of accounting work, but machines can learn to perform redundant, repeatable and oftentimes extremely time-consuming tasks.


Model-Based Software Engineering to Tame the IoT Jungle

The ThingML approach's first goal is to allow abstracting from the heterogeneous platforms and IoT devices to model the desired IoT system's architecture. In practice, platforms and devices, as well as the final distribution of software components, typically aren't known during the early design phases. The architecture model consists of components, ports, connectors, and asynchronous messages. Once the general architecture is defined, our approach allows for specification of the components' business logic in a platform-independent way using statecharts and the action language. ThingML statecharts include composites, parallel regions, and history states. The state machines typically react to events corresponding to incoming messages on a component's port.
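The component model described above (state machines reacting to messages arriving on a component's port) can be illustrated with a small Python sketch. The `Thermostat` component and its events are invented for the example and are not ThingML syntax:

```python
# A minimal statechart-style component: incoming messages fire
# transitions; unknown (state, event) pairs leave the state unchanged.
class Thermostat:
    TRANSITIONS = {
        ("idle", "too_cold"): "heating",
        ("heating", "warm_enough"): "idle",
    }

    def __init__(self):
        self.state = "idle"

    def on_message(self, event: str) -> None:
        self.state = self.TRANSITIONS.get((self.state, event), self.state)

t = Thermostat()
t.on_message("too_cold")
print(t.state)  # heating
t.on_message("warm_enough")
print(t.state)  # idle
```

Composite states, parallel regions, and history states, which ThingML supports, would layer on top of this basic transition table.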



Quote for the day:


"Integrity is the soul of leadership! Trust is the engine of leadership!" -- Amine A. Ayad


Daily Tech Digest - July 06, 2017

Connecting 400 million people to the IoT? No sweat.

The first wave of the rollout will connect nearly 2000 communities and covers an estimated 400 million people, making the project the first of its kind in terms of scale. In addition to successful trials in Mumbai, Delhi and Bangalore, the two firms have 35 proof-of-concept applications under trial over the network. Tata are aiming to speed up digital transformation in a variety of sectors including supply chain, utilities and healthcare. “The sheer size of this project is incredible, bringing new services to millions of people,” said David Sliter, vice president and general manager of the Communications Solutions Business at HPE. “Through our partner-centric approach, the HPE Universal IoT platform will enable Tata Communications to build multiple vertical use cases for its IoT network in India on a common platform with a common data model.”


Navigating the AI ethical minefield without getting blown up

As AI, big data, and the related fields of machine learning, deep learning, and computer vision/object recognition rise, buyers and sellers are rushing to include AI in everything, from enterprise CRM to national surveillance programmes. An example of the latter is the FBI’s scheme to record and analyse citizens’ tattoos in order to establish whether people who have certain designs inked on their skin are likely to commit crimes. Such projects should come with the label ‘Because we can’. In such a febrile environment, the risk is that the twin problems of confirmation bias in research and human prejudice in society become an automated pandemic: systems that are designed to tell people exactly what they want to hear; or software that perpetuates profound social problems.


How to manage vendors in a cloud-first world

The first is centralized, where everything -- including contract, performance, financial, risk and relationship management -- is handled centrally by vendor management and procurement professionals. This offers control and consistency, but can be inflexible and slow to react. Another common approach is decentralized, where vendor management is generally handled by individual business and IT departments, which has the benefit of being more responsive to issues as they arise but can have laxer and less consistent controls. The third, more recent approach, is hybrid, where vendor management functions are spread across the organization, according to where they fit best. In this scenario certain functions, such as contract management, could be handled centrally, while others, such as performance management, are carried out by individual departments or IT groups.


3 steps to create a digital banking relationship center

CIOs should help the business steer the transformation to a digitally enabled contact center. However, the work also involves developing an integrated platform that can help a new breed of “empathetic advisors” to serve customers. For instance, CIOs can help develop capabilities that allow intelligent machines to produce data insights that are shared with customer service agents so that they can deliver the most personalized services possible.  Right now, legacy technology hinders positive customer experiences. Data is piecemeal, there are gaps in knowledge, and silos make it near impossible to gain a comprehensive view of the customer. With an effective platform strategy, banks can break through the silos and deliver personalized customer experiences.


Use the cloud to achieve operating model transformation: Six steps

This also means that the cloud should be just one component of corporate IT's operating model transformation. Cloud can't be expected to deliver top-line digital results without software design and development strategies that connect it to digital business projects. This has spurred IT leaders' efforts to bring DevOps into enterprises and include the use of teams that coach developers to code for cloud-based architectures. It's not possible for cloud to deliver without new business engagement approaches that connect cloud adoption strategies to opportunities for scale and innovation. In one organization we studied, this led to the establishment of "cloud champions" -- commercial product managers who teach business unit leaders how to make use of cloud capabilities in their products for better commercial outcomes.


6 Robo Advisory Firms Trying Hard to Innovate

We’re often puzzled as to how people can use terminology to put an entirely new coat of paint on something that’s been around for a long time and then purport that it’s new and exciting. Take robo advisors as an example. In the good old days, you met with a wealth manager who asked you a bunch of questions and then plugged them into a piece of software that said where you should optimally invest your money. Usually, this is just a very simple asset allocation strategy with X% in bonds and Y% in stocks based on your age and appetite for risk. Then, as a result of the perpetual cost savings initiatives that permeate the corporate world, we decided to remove the human element, externalize the software, and call it a “robo advisor”, a term that somehow implies that there is more under the hood than there actually is.
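The "X% bonds, Y% stocks" logic the excerpt pokes fun at really is this simple. Here is a hedged sketch of one common rule of thumb (bond share roughly tracks age, nudged down for higher risk appetite; the exact adjustment factor is invented for illustration):

```python
def allocate(age: int, risk_appetite: float) -> dict:
    """Toy robo-advisor allocation.

    risk_appetite runs from 0.0 (cautious) to 1.0 (aggressive); the
    20-point adjustment is an arbitrary choice for this example.
    """
    bonds = max(0, min(100, age - int(risk_appetite * 20)))
    return {"bonds_pct": bonds, "stocks_pct": 100 - bonds}

print(allocate(age=30, risk_appetite=0.5))  # {'bonds_pct': 20, 'stocks_pct': 80}
print(allocate(age=65, risk_appetite=0.0))  # {'bonds_pct': 65, 'stocks_pct': 35}
```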


Is your sandbox strategy keeping you safe?

To better detect APTs, security professionals are deploying advanced threat detection and protection technologies, often including virtual sandboxes which analyse the behaviour of suspicious files and uncover hidden, previously unknown malware. However, threats are getting smarter each day, and many vendors’ sandbox techniques simply have not kept pace. An APT is a set of stealthy and continuous computer hacking processes, often orchestrated by criminals targeting a specific entity. These threats often include unknown and undocumented malware, including zero-day threats. They are designed to be evolving, polymorphic and dynamic. They are targeted to extract or compromise sensitive data, including identity, access and control information. While these types of attacks are less common than automated or commoditised threats that are more broadly targeted, APTs pose a serious emerging threat.


Data Center Automation Advances Hybrid Cloud

A recent study by Morar Consulting concludes that it's difficult to realize savings in energy consumption and efficient operations without a data center infrastructure management (DCIM) system. The implication of the study is that data centers have gotten so complex with so many invisible, moving parts, including virtual servers and containers, that average systems administrators/data center operations specialists need all the help they can get. Morar contacted 101 CIOs, CTOs and senior data center managers across the US for the study and another 100 in the UK. "A successful DCIM solution deployment allows managers to understand, manage and optimize the myriad amounts of data under their control," the Morar study said. The study was sponsored by Intel and Schneider Electric.


Get Ready for the Post-Cloud World

In the fast-approaching post-cloud world, the new differentiator is in how well an organization procures and consumes many digital business assets simultaneously, dynamically, and in deep synergy — across a diverse and complex supply chain and easily reconfigured installed base. Technology’s new poster child, artificial intelligence (AI), and its engine, machine learning, will be a major reason why cloud goes mundane, practically invisible, fairly soon. To make the most of a spectrum of apps and data hosting options, many variables need to be considered, evaluated, measured, reassessed and implemented. Repeat. All the time. To me, managing the complexity of such hybrid computing real-time procurement is a killer app of AI. These are not necessarily the skills resulting from completing a Microsoft certification process.


The Robot Automation Tipping Point

There are robots all around you, and you didn't notice: from factory automation to self-driving cars, to drones, to the machines that make your pizza or process your parking payment, to the wild and fun automation that happens at a Disney resort. You can also build bots to analyze large amounts of data. Do you trade stocks? Well, 99% of the processing happens on computers rather than through brokers. That's right: even the trades are not completed by brokers anymore. People are going crazy over "chat bots," but they are just one part of the automation tipping point. Humans love stories. We love reading books and engaging in conversation. The messaging interface is simple and universal, which allows people of all ages and walks of life to understand and use it. But what about all those people losing their jobs?



Quote for the day:


"In programming simplicity and clarity are not a dispensable luxury, but a crucial matter that decides between success and failure." -- Dijkstra


Daily Tech Digest - July 05, 2017

Seven tips for working with Office shapes

Shapes are drawing objects—lines, circles, rectangles, and so on—that you can use to enhance Office documents. You might add a simple line to distinguish your name and address in your resume. Or you might add a bit of pizazz to a marketing document. Shapes are available in Excel, Outlook, Word, and PowerPoint. You can enhance them using colors, patterns, borders, and other special effects. ... If you find yourself tweaking the same shape each time you enter it, stop. Instead, modify one shape and then set its properties to the shape's default properties. To do so, right-click the modified shape and choose Set As Default Shape from the shortcut menu. All subsequent shapes will exhibit your custom properties instead of the out-of-the box defaults.


Minerva protects endpoints with trickery and deception

The idea is that most normal threats will be blocked by traditional antivirus and Minerva will stop anything that attempts to get around that protection. In fact, Minerva officials stress that their toolset won’t protect anything without some type of antivirus first installed. It’s designed to work with any antivirus program, including Windows Defender and any of the offerings from companies like Symantec, McAfee, AVG, TrendMicro and others. The Minerva protection is installed as software, with the main interface and console running locally on a customer’s server or based within the cloud. Our test program was active on a physical server. Once installed, the program pushes agents out to every endpoint that needs to be protected. The agents are very lightweight, with each one taking up about 24 megabytes.


What are the impacts of facial recognition tech on society?

The perception was much different pre-September 11, 2001. This type of futuristic technology was something people saw only in Hollywood, and it fell under the umbrella of "Big Brother is watching us." At Super Bowl XXXV the federal government ran a test in which it scanned the 100,000 attendees and reported finding 19 potential risks. The test was subsequently discovered by the media, leading to a public conversation about privacy concerns. When questioned about the secret test, Tampa police spokesman Joe Durkin said, “It confirmed our suspicions that these types of criminals would be coming to the Super Bowl to try and prey on the public.” The dilemma, which in my opinion was the result of 9/11, becomes a conversation about improved security and the impact on our personal privacy.


Consumer IoT vs. Industrial IoT – What are the Differences?

Because IIoT systems can result in the generation of billions of datapoints, consideration also has to be afforded to the means of transmitting the information from the sensors to their final destination – usually an industrial control system such as a SCADA (supervisory control and data acquisition) platform. In order to not overwhelm these centralized systems with data, IIoT manufacturers are increasingly devising hardware that can carry out preliminary analytics directly at the device-level rather than on a program running in a cloud-based server (an emergent methodology known as edge computing or fog computing). Consumer IoT applications naturally tend to involve fewer devices and data points. Minimizing throughput to central servers is therefore less of a concern.
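The device-level preliminary analytics described above can be as simple as summarizing a window of raw readings and shipping only the summary upstream, sparing the central SCADA platform the full firehose. A minimal sketch (the sensor values are invented):

```python
# Edge-style preprocessing: summarize raw sensor readings on the device
# and transmit only the summary instead of every datapoint.
def summarize(window: list[float]) -> dict:
    return {
        "n": len(window),
        "mean": sum(window) / len(window),
        "max": max(window),
        "min": min(window),
    }

raw = [20.1, 20.3, 20.2, 24.9, 20.2]  # e.g. one minute of temperature samples
payload = summarize(raw)
print(payload)  # four numbers go upstream instead of the full window
```

The `max` field is what lets the central system still catch the 24.9 spike even though the individual samples never leave the device.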


Big data is helping reinvent companies through digital transformation

Stratio brings together the latest, most disruptive technologies into a product that transforms businesses in-flight. Stratio DataCentric is designed to help Fortune 500 companies radically overhaul their information architecture in small, tactical steps and put data at the core of their business so that daily decisions and strategic planning can be taken based on powerful insight and well-grounded knowledge. With a lack of skills and inertia in organisational culture being major barriers to transformation, we provide both the technology and the skills that companies need. Our team of 300 staff accompany clients on their journey through transformation and we often establish joint-ventures. ... We have also developed a digital promotions platform that sends targeted deals to the customers of a global retailer in real-time.


8 Things Every Security Pro Should Know About GDPR

Companies that fail to comply with GDPR requirements can be fined up to 2% or 4% of their annual global revenues, depending on the violation, or up to €20 million (just under $22.4 million at current exchange rates), whichever is higher. Enforcement of GDPR begins May 25, 2018. It replaces Data Protection Directive 95/46/EC, a 1995 statute governing the processing and protection of private data by companies within the EU. One of its biggest benefits for covered entities is that GDPR establishes a common data protection and privacy standard for all member nations within the EU. Organizations within the EU and elsewhere will still need to deal with data protection authorities in each of the 28 member countries, but they will no longer be subject to myriad different requirements from each member nation.
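The fine calculation is a simple "whichever is higher" rule. A sketch based on GDPR Article 83, which defines two tiers (up to 2% of annual global revenue or €10 million for lower-tier violations, and up to 4% or €20 million for upper-tier ones); the revenue figures are invented:

```python
def max_gdpr_fine(annual_global_revenue_eur: float, severe: bool) -> float:
    """Maximum possible fine under GDPR Article 83.

    Upper tier: up to 4% of revenue or EUR 20M, whichever is higher.
    Lower tier: up to 2% of revenue or EUR 10M, whichever is higher.
    """
    pct, floor = (0.04, 20_000_000) if severe else (0.02, 10_000_000)
    return max(pct * annual_global_revenue_eur, floor)

print(max_gdpr_fine(1_000_000_000, severe=True))  # 4% of EUR 1B = EUR 40M
print(max_gdpr_fine(100_000_000, severe=True))    # revenue is small, so the EUR 20M floor applies
```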


Alex Pinto on the intersection of threat hunting and automation

The environments where I've seen the most success with threat hunting utilized their incident response (IR) team for the task or built a threat hunting offshoot from their IR team. These team members were already very comfortable with handling incidents within the organization. They already understood the environment well, knew what to look for, and where they should be looking. IR teams may be able to spend some of their time proactively looking for things and formulating hypotheses of where there could be a blind spot or perhaps poorly configured tools, and then researching those potential problem areas. Documentation is key. By documenting everything, you build organizational knowledge and allow for consistency and measurement of success.


Operationalizing Data Governance: Actualizing Policies for Data Validation

Businesses are finally recognizing the value of data governance as an enterprise program, with the emergence of the Chief Data Officer (CDO) as a C-level role charged with ensuring that corporate data sets meet defined business data quality expectations. There are two aspects to this governance: defining policies, and operationalizing compliance with them. One straightforward approach to data validation is to make it part and parcel of the application architecture: adjust development methodologies to direct software designers to embed data validation as an integral component of their applications. Institutionalizing data validation within the organization’s application environment is predicated on standardizing an approach for defining data quality. Yet as is becoming more apparent, the definition of “data quality” is not monolithic.
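Embedding data validation in the application, as the excerpt suggests, can be as simple as a rule-checking function that runs before a record enters downstream systems. A minimal sketch; the field names and rules are hypothetical, standing in for whatever the governance policies define:

```python
def validate_customer_record(record):
    """Apply governance-defined quality rules to a record before it
    enters downstream systems; return a list of rule violations."""
    errors = []
    if not record.get("customer_id"):
        errors.append("customer_id is required")
    if "@" not in record.get("email", ""):
        errors.append("email must contain '@'")
    age = record.get("age")
    if age is not None and not (0 <= age <= 120):
        errors.append("age out of plausible range")
    return errors

# A clean record passes; a malformed one reports each broken rule.
print(validate_customer_record({"customer_id": "C1", "email": "a@b.com", "age": 30}))
print(validate_customer_record({"customer_id": "", "email": "no-at-sign"}))
```

Keeping the rules in one function (or a shared library) is what lets the governance program change a policy once and have every application pick it up.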


Less Is More For Canadian Quantum Computing

Researchers in Canada have found a way to make a key building block for quantum computing from a custom photonics chip and off-the-shelf components intended for use in telecommunications equipment. They have built a chip that can create entangled pairs of multicolored photons, which can be manipulated as two "qudits" - quantum computing digits that can each hold 10 possible values. Where classical computers operate on values in sequence, quantum computers are able to express all possible values of a variable simultaneously, collapsing to the "right" answer at the end of the calculation. Not all computing problems benefit from this treatment, but it is particularly useful in the factorization of large numbers, necessary for cracking many forms of encryption.
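The state space involved can be sketched in standard notation: a single 10-level qudit is a superposition over its basis states, and an entangled pair of such qudits spans a 100-dimensional space. The amplitudes \(c_i\) below are generic, not values from the article:

```latex
% A single qudit with d = 10 levels:
|\psi\rangle = \sum_{i=0}^{9} c_i\,|i\rangle,
\qquad \sum_{i=0}^{9} |c_i|^2 = 1

% A maximally entangled pair of such qudits (spanning 10 x 10 = 100 basis states):
|\Phi\rangle = \frac{1}{\sqrt{10}} \sum_{i=0}^{9} |i\rangle_A \otimes |i\rangle_B
```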


How to use the mediator design pattern in C#

In the mediator design pattern, the objects don’t communicate with one another directly but through the mediator. When an object needs to communicate with another object or a set of objects, it transmits the message to the mediator. The mediator then transmits the message to each receiver object in a form that is understandable to it. By eliminating direct communication between objects, the mediator design pattern promotes loose coupling. ... Note that the mediator design pattern differs from the facade design pattern. The mediator pattern facilitates how a set of objects interact, while the facade pattern simply provides a unified interface to a set of interfaces in the application. Thus the mediator pattern is a behavioral pattern that deals with object behavior, while the facade pattern is a structural pattern that deals with object composition.
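The article's examples are in C#; the same structure can be sketched in Python. The names (`ChatMediator`, `Participant`) are illustrative, not taken from the article - the point is that participants only ever talk to the mediator, never to each other:

```python
class ChatMediator:
    """Mediator: relays messages between registered participants so
    that they never hold references to one another."""
    def __init__(self):
        self._participants = []

    def register(self, participant):
        self._participants.append(participant)
        participant.mediator = self

    def broadcast(self, sender, message):
        # Deliver to every participant except the sender.
        for p in self._participants:
            if p is not sender:
                p.receive(sender.name, message)

class Participant:
    def __init__(self, name):
        self.name = name
        self.mediator = None
        self.inbox = []

    def send(self, message):
        # All communication goes through the mediator, not peer to peer.
        self.mediator.broadcast(self, message)

    def receive(self, sender_name, message):
        self.inbox.append((sender_name, message))

mediator = ChatMediator()
alice, bob = Participant("Alice"), Participant("Bob")
mediator.register(alice)
mediator.register(bob)
alice.send("hello")
print(bob.inbox)  # [('Alice', 'hello')]
```

Because `Participant` knows only the mediator, adding a third participant (or changing delivery rules) touches the mediator alone - the loose coupling the article describes.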



Quote for the day:


Brooks' Law – "Adding manpower to a late software project makes it later."


Daily Tech Digest - July 04, 2017

The Internet of Things for banks: it’s here, it’s real, and it’s set to grow exponentially

Overall, with access to the unique data of individual homes, IoT has the potential to create personalised mortgages, factoring in both the property and the person. In fact, the financial sector can bring progress to the housing market with a whole range of technologies. The immediacy of transactions run on distributed ledger technology could verify house purchases more efficiently; processing time can be improved by big data stored in the cloud. But with the rise of data and connected devices, CIOs and CISOs are clearly under increasing pressure. Not only from the demands for increased scalability as the sources of data presented by the rise of the IoT grow, but also from the need to build trust around security. So how can financial services firms manage and analyse the huge amount of data the IoT generates, while exploiting the flexibility of on-demand services, without compromising security?


The Fourth Industrial Revolution: Why you need a global IoT strategy

Get past the pilot stage of IoT that many organizations seem to be stuck in. Even if you are just trialing IoT technology, you should already know the next step you will take if the trial works. For instance, if you are trialing sensors to track goods flows through your warehouse, your next step might be to place sensors on goods and on trucks so you can track transport. Your end goal might be 360-degree visibility of all goods in your supply chain from point of origin to point of shipment. Adopt IoT technologies that can interoperate with each other. There are still incompatibilities between IoT products from different vendors. Find vendors that participate in standards groups that promote device interoperability. Keep your focus on security. Botnets like Mirai will continue to attack IoT. Your security should be regularly updated so you have the best possible protection.


Should organisations be switching their certificate authority?

To ensure the security of your cryptographic assets, agility is key. Let’s say your CA is compromised by a cyber attack and your certificates from that CA move to an untrusted state. First, you have to be able to locate all impacted certificates. You’ll then need to reissue certificates from another CA. Which CA’s management console will you use to complete this arduous task? Can you expect any CA to provide the functionality that helps you move certificates to a competing CA? Granted, a compromise is probably the worst-case scenario for switching CAs. However, there may be other cases which are less dramatic, but are still as important to address. Let’s say the employees at your CA make an (all too human) mistake such as mis-coding a batch of certificates or accidentally revoking a root certificate. You’re basically left in the same situation; you experience a service outage.


Get started with the Windows Subsystem for Linux

Technically, WSL is for console-only applications, providing shell support for developer tools and remote access to Linux servers running on-premises and in the public cloud. But it’s turned out to be a lot more flexible, and although this is not officially supported, users have installed and run X-based GUI applications, using Windows X servers to bring a full Linux desktop experience to WSL. Working with any of the WSL personalities is like working with native Linux. You’ve got access to a shell, and through it the command line. Installing applications is as simple as using apt-get on Ubuntu or yast and zypper on SUSE. When Fedora makes its way to Windows, you’ll use yum. Early WSL builds had problems running some applications, because the key dependencies weren’t supported. But since the Windows 10 Anniversary Update release, it’s been a lot easier, and now even complex packages like Docker install and run.


The Hidden Pitfalls Of Going Freelance In IT

“This is exacerbated in the IT world, because more often than not, you are going to be working remotely,” says Brattoli, who’s been freelancing on and off for his entire IT career. “Technology is wonderful in that it makes it possible for us to work from anywhere with an Internet connection. But there is still value in being able to meet face-to-face, and many companies are hesitant to trust someone they haven’t met.” In addition, at many companies the tech-savvy people running a project will know what needs to be done to meet the desired outcomes. “But once that’s all figured out, it is very hard to convince the people above them to go through with it,” Brattoli says. “Where technology is concerned, people who are less tech-savvy are going to be wary of any new changes to infrastructure.”


Analysis: Top Health Data Breaches So Far in 2017

As of July 3, 149 breaches affecting a total of nearly 2.7 million individuals have been reported to federal regulators so far in 2017, according to the Department of Health and Human Services' so-called "wall of shame" website of breaches affecting 500 or more individuals. Of those 2017 breaches, 53 are listed as hacking/IT incidents. And although they only represent about one-third of the breaches reported in 2017, those incidents are responsible for affecting 1.6 million individuals, or about 60 percent of the victims impacted. Those incidents include a ransomware attack reported to HHS on June 16 by Airway Oxygen, a Michigan-based provider of oxygen therapy and home medical equipment. That incident is listed on the federal tally as affecting 500,000 individuals, making it the second largest health data breach posted so far this year.


10 tips for mastering PowerPoint

With an estimated 500 million users worldwide, PowerPoint remains a key presentation tool in many enterprises, with Microsoft recently adding collaboration tools to further enhance its utility in the workplace. However, the platform includes many features that often fly under the radar, but can give your slide decks a boost. "We all know how easy it is to create and deliver a bad, mind-numbing presentation," wrote TechRepublic feature editor Jody Gilbert. "Fortunately for both presenters and their hapless victims, various add-ons are available to make presentations more functional and compelling." Here are 10 popular TechRepublic articles with tips for becoming a Microsoft PowerPoint expert and getting the most out of the presentation program.


The Growing Role of Machine Learning in Monitoring

While the adoption of machine learning in DevOps is relatively slow compared to other industries, the potential is huge. To start understanding what DevOps has to gain from this rapidly developing field, one needs only to look at the world of monitoring and log analysis, where machine learning can be used to alleviate some of the main pain points experienced by DevOps teams: namely, the analysis of vast volumes of data and the extraction of actionable insights from that data. Based on the monitoring solutions on show at Monitorama this year, I can safely claim that in this space at least, the machine learning revolution is well under way. ... Using a combination of supervised and unsupervised machine learning algorithms, Moogsoft promises to improve the signal-to-noise ratio of alerts and correlate those alerts across your toolsets in real time.
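As a toy illustration of unsupervised analysis on monitoring data - emphatically not Moogsoft's actual algorithm - a z-score filter can flag outliers in a metric stream without any labeled training data. The latency figures are made up:

```python
import statistics

def flag_anomalies(series, threshold=2.5):
    """Unsupervised baseline: flag indices whose values lie more than
    `threshold` standard deviations from the mean of the stream."""
    mean = statistics.fmean(series)
    stdev = statistics.pstdev(series)
    if stdev == 0:
        return []  # a flat series has no outliers
    return [i for i, x in enumerate(series)
            if abs(x - mean) / stdev > threshold]

latency_ms = [102, 98, 101, 99, 100, 480, 103, 97]
print(flag_anomalies(latency_ms))  # [5] - the 480 ms spike
```

Real monitoring products layer far more on top (seasonality, multi-signal correlation, alert clustering), but this captures the core idea: learn a baseline from the data itself and surface deviations.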


Make room for AI applications in the data center architecture

"If you look at the deep learning algorithms, they're extremely communication-intensive," Dekate said. "Trying to architect solutions for such a chatty application stack is going to be very hard for organizations to get their heads around." As data center networking architects prep their infrastructures for AI they must prioritize scalability, which will require high-bandwidth, low-latency networks and innovative architectures, such as InfiniBand or Omni-Path. The key is to keep all options open for automation, Perry said. The market is quickly maturing with automated data center infrastructure management technologies, a sign that automation is becoming more widely accepted in data centers. "Once more automation features are in place … this will help set the stage for the introduction of AI," Perry said.


Digitisation to transform the UK’s criminal justice system

Digitisation provides the opportunity to re-build the processes of the justice system around the citizen. Pilot initiatives such as the digital case file and online plea submissions have begun to prove the concept in practice, showing how digitisation can increase access to justice whilst reducing costs, streamlining processes and improving quality. Liz Crowhurst, policy officer, The Police Foundation and the report’s author, said: “At a time when justice agencies are under pressure to reduce costs, even as the complexity of cases increases, digitisation offers significant opportunities to radically improve services while increasing cost-efficiency and transparency. This, in turn, will deliver improved outcomes for victims, witnesses, defendants and offenders.”



Quote for the day:


"Next Industry to Embrace Blockchain is Aerospace" Accenture - CoinTelegraph