Daily Tech Digest - November 06, 2017

Google can read your corporate data. Are you OK with that?
The big concern from enterprises this week was not being locked out of Google Docs for a time, but the fact that Google was scanning documents and other files. Even though this is spelled out in the terms of service, it’s uncomfortably Big Brother-ish, and it raises fresh questions about how confidential and secure corporate information really is in the cloud. So, do SaaS, IaaS, and PaaS providers make it their business to go through your data? If you read their privacy policies (as I have), the good news is that most don’t seem to. But have you actually read through them to know who, like Google, does have the right to scan and act on your data? Most enterprises do a thorough legal review for enterprise-level agreements, but much cloud service use is by individuals or departments whose subscriptions never get such IT or legal review.


How microservices governance has evolved from SOA


Governance with monoliths is centralized. Decisions are made top-down, and rigid control is maintained to ensure uniformity across the organization and the application stack. Over time, this model degenerates, creating a system that becomes technologically and architecturally stagnant and slowing the pace of innovation. Teams are forced to merely conform to the established order rather than look for new, creative solutions to problems. For microservices governance, a decentralized model works best. Just as the application itself is broken down into numerous interdependent services, large, siloed teams are broken down into small, multifunctional teams. This mirrors the progression of separate development, testing and IT operations teams morphing into smaller DevOps teams.



5 cyber threats every security leader must know about

The first is Consumer IoT. These are the devices we are most familiar with, such as smartphones, watches, appliances, and entertainment systems. Users insist on connecting many of these to their business networks to check e-mail and sync calendars, while also browsing the Internet and checking on how many steps they have taken in the day. The list of both work and leisure activities these devices can accomplish continues to increase, and the crossover between these two areas presents increasing challenges to IT security teams. ... The cloud is transforming how business is conducted. Over the next few years, as much as 92 percent of IT workloads will be processed by cloud data centers, with the remaining 8 percent continuing to be processed in traditional on-premises data centers.


Inside-Out: How IoT Changes Everything


"Design thinking is a way to place the user at the heart of the innovation process," he said. "Our company strategy is really that innovation is not coming from startups or technologies, but from the end users and the customer observation. It's really focused on the end user. We are working, for example, with ethnologists and psychologists to understand the problems and to describe the problems. It's really important for us." Celier explained that VISEO created specialized innovation centers as part of their One Roof program. The idea is to bring clients into their production studios, much like filmmakers bring all the talent into a studio for producing movies. "We are incubating our customer's project in our building. It's a way to go faster. They come with their vision, their idea, and they leave with a platform or product," he said.


Cybersecurity thwarts productivity and innovation, report says


The top priority of most organizations — cybersecurity — is hindering productivity and innovation, according to a recent report by Silicon Valley-based virtualization firm Bromium. Based on a survey of 500 chief information security officers in large organizations in the U.S., U.K. and Germany, 74 percent of respondents said end users were frustrated by how security requirements disrupt operations. "Our research found, on average, an organization gets complaints from users twice a week saying that legitimate work activity is being blocked or rejected by over-zealous security systems," the report reads. Citing that most — 88 percent — of organizations use a prohibition approach to cybersecurity, the firm suggests "a new approach" that allows more technological innovation within the organization.


Securing Smart Homes

“The industry is starting to get educated about the need for [better security],” Dirvin says. “Now they ask more questions about it and are willing to spend more time and effort,” but not always money. Manufacturers of smart home devices typically haven’t had to think about security in the same way as a medical device maker or a manufacturer of industrial automation. “It’s a whole new area for them, so they’re rushing to build connectivity and incorporate these devices into a broader IoT strategy,” says Warren Kurisu, director of product management in the embedded systems division at Mentor, a Siemens business. “The security, from a software perspective, is something they’re just now starting to realize that they need to do.” This is especially true in the wake of the Mirai attack. The number of connected devices is expected to reach 20.4 billion by 2020, according to Gartner.


Was BadRabbit a distraction? Malware 'used to cover up smaller phishing attacks'

"There is an open, let's say instantly obvious attack, while underneath there is a hidden, fairly well-thought-out attack, to which nobody pays attention," police chief Serhiy Demedyuk told attendees while speaking at the Reuters Cyber Security Summit in Kiev. "During these attacks, we repeatedly detected more powerful, quiet attacks that were aimed at obtaining financial and confidential information." He said the so-called "hybrid attack" – meaning a multi-pronged assault – was also found to be targeting users of a popular form of Russian accounting software called 1C. "The main theory we're working on now is that they [the hackers in both attacks] were one and the same," Demedyuk added. "The goal was to get remote and undetected access."


The Internet of Things is about much more than just connecting devices

The connected world towards which we are migrating will allow manufacturers to understand, in real time, what their customers require. That in turn enables a manufacturer not only to recalibrate the manufacturing side of the business and what it procures, but also to become highly competitive and closely in tune with customer requirements, down to quality requirements per customer. That transparency will drive product improvement and customer satisfaction to new levels, and manufacturers will no longer order more raw material than they need. Think about latency and how it will be addressed. Consider this example: a customer wants a product; between the procurement of materials, import, export, shipping, logistics and manufacturing, fulfilment can take six months or more.


7 habits of highly effective digital transformations

The collaborative efforts have paid off. “As a result of sharing practices, we have identified cases where we see a common failure mode in our continuous integration, delivery and operational practices — and then we are able to propagate the fix across all teams and improve and correct across all teams,’’ Fairweather says. Management also conducted a survey of its strategic foundational technology program. Fairweather recalls one comment an employee gave as feedback: “Instead of being a cog in the wheel I’m a better-informed contributor. The best part of learning from peers is gaining new contacts. We are more united as a global organization in pursuing these 10 areas because we have done this.’’ ... As organizations get larger, different groups can begin to cut themselves off from one another, creating silos of information, he says.


6 Steps Up: From Zero to Data Science for the Enterprise

Different stakeholders have different views about the desire for a Customer360, but perhaps the most clarifying is that for a company to truly drive value and delight its customers, the business must understand those customers and approach every question from their perspective. Without a Customer360 built on a foundation of data science, the business will only ever have a qualitative view of customers. I believe a true, quantitative understanding of customers relies on rigorous data science. Less attention has been paid to the concept of a Product360, but it's no less important. Depending on the business, a Product360 can potentially drive more value through cost savings and cost avoidance than the business can derive from new revenue. The ultimate goal of a Product360 is creating assets that allow the business to explore each product from earliest inception through the end of its lifecycle.




Quote for the day:

"Instinct is intelligence incapable of self-consciousness." -- John Sterling


Daily Tech Digest - November 05, 2017

The end of the cloud is coming

An internet powered largely by client-server protocols (like HTTP), with security based on trust in a central authority (like TLS), is flawed and causes problems that are fundamentally either very hard or impossible to solve. It’s time to look for something better — a model where nobody else is storing your personal data, large media files are spread across the entire network, and the whole system is entirely peer-to-peer and serverless (and I don’t mean “serverless” in the cloud-hosted sense here; I mean literally no servers). ... Peer-to-peer web technologies aim to replace the building blocks of the web we know with protocols and strategies that solve most of the problems I’ve outlined above. Their goal is completely distributed, permanent, redundant data storage, where each participating client in the network stores copies of some of the data available in it.
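The content-addressing idea behind such peer-to-peer storage can be sketched in a few lines: blocks are keyed by the hash of their own bytes, so any peer can verify retrieved data without trusting whoever served it. This is a toy illustration of the general idea (all names invented), not any particular protocol's implementation:

```python
import hashlib

class ContentStore:
    """Toy content-addressed store: each block is keyed by the hash of
    its own bytes, so a peer can verify data without trusting the sender."""
    def __init__(self):
        self.blocks = {}

    def put(self, data: bytes) -> str:
        cid = hashlib.sha256(data).hexdigest()  # the block's "content ID"
        self.blocks[cid] = data
        return cid

    def get(self, cid: str) -> bytes:
        data = self.blocks[cid]
        # Integrity check: recompute the hash rather than trusting the peer.
        if hashlib.sha256(data).hexdigest() != cid:
            raise ValueError("corrupted block")
        return data

store = ContentStore()
cid = store.put(b"hello, distributed web")
assert store.get(cid) == b"hello, distributed web"
```

Because the key is derived from the content, identical data stored by many peers shares one address, which is what makes the redundant, serverless replication described above possible.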

Europe’s businesses leaving workers behind in the technology skills race
Human-centred interactions between people and machines have profound implications for the design of products and services. No longer do consumers need to command machines through a graphical interface: voice interfaces such as Alexa, Siri and Cortana have changed that. Next, the emphasis will shift from understanding meaning to interpreting intent. For example, in Toyota’s Concept-i car, instead of the driver commanding its virtual AI assistant, Yui, to turn the AC up, Yui will be able to understand intent in statements like “I’m feeling a bit cold.” It isn’t necessary to look into the future to see this trend: data-driven products are already taking on board the emotional reactions of their users. For that reason, the best data-driven services don’t exhaust the user with endless data-gathering questions: Apple Music asks new users to “Tell us what you’re into” and presents a few bubbles containing genres to select.


Blockchain Aims to Foster Payer, Provider Trust for Value-Based Care

Value-based care has accelerated the need for seamless data sharing in an environment that is both transparent and unquestionably trustworthy – one that can bring payers and providers together to improve quality, reduce costs, and enhance the patient experience. While stakeholders have offered up plenty of potential solutions for creating a free-flowing data environment that can support the complex environment of pay-for-performance reimbursements, blockchain may be the methodology that ticks the most boxes with a relatively low amount of effort. At Hashed Health, an industry consortium dedicated to applying blockchain to real-world use cases, CEO John Bass believes that the distributed ledger approach offers a number of promising improvements to the way providers, payers, and patients collaborate in a value-based world.


Future of Digital Currency May Not Involve Blockchains

The problem with cryptocurrencies conceived before Bitcoin was their centralized structure. Without Blockchain technology, there was no “decentralized, immutable, transparent” ledger in which transactions could be recorded, leading to centralization. Yet it looks like the Blockchain may not be the be-all and end-all of digital currency technologies. Recently, a new form of cryptocurrency has emerged that leverages the Directed Acyclic Graph (DAG) organizational model for the structure of its decentralized ledger, allowing old problems to be solved and new features to be added. Today, we’re going to take a look at the technology that could potentially replace the Blockchain itself, and at some of its current implementations. Although the implementations that we are going to discuss today are new, the concept is not.
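A minimal sketch of the DAG idea (loosely tangle-style): instead of miners batching transactions into blocks, each new transaction directly approves earlier ones, and a transaction gains confidence as later transactions reference it. The random tip selection and all names here are deliberate simplifications of what real DAG ledgers do:

```python
import hashlib
import random

class DagLedger:
    """Toy DAG ledger: each new transaction approves earlier ones directly,
    instead of waiting to be batched into a block by a miner."""
    def __init__(self):
        self.txs = {"genesis": {"parents": [], "data": "genesis"}}

    def add(self, data: str) -> str:
        # Real systems use weighted tip selection; random choice is a stand-in.
        parents = random.sample(list(self.txs), k=min(2, len(self.txs)))
        digest = hashlib.sha256((data + "".join(sorted(parents))).encode())
        tx_id = digest.hexdigest()[:12]
        self.txs[tx_id] = {"parents": parents, "data": data}
        return tx_id

    def ancestors(self, tx_id: str) -> set:
        """Every transaction that tx_id directly or indirectly approves."""
        seen, stack = set(), list(self.txs[tx_id]["parents"])
        while stack:
            p = stack.pop()
            if p not in seen:
                seen.add(p)
                stack.extend(self.txs[p]["parents"])
        return seen

ledger = DagLedger()
first = ledger.add("pay alice 5")   # approves genesis
second = ledger.add("pay bob 3")    # approves genesis and first
assert ledger.ancestors(second) == {"genesis", first}
```

The structural difference from a blockchain is visible in `parents`: a transaction can have several, so the ledger fans out as a graph rather than a single chain.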


Do More With Machine Learning Thanks To These 6 Open Source Tools

One problem the industry is seeing, however, is that there’s a severe lack of developers and new talent. It’s a problem for the entire development and programming industry, not just machine learning. Many companies and brands are vying for the same new employees, leaving startups and newer names struggling to hire. Luckily, this can be offset by adopting open-source development practices. More importantly, you can open your projects — future and present — to an even broader development community and audience by making them open source. Open-source tools allow anyone to contribute to a project and work on fixes for bugs, new features and new builds. You can reconcile separate versions, selecting the content and elements you want in an official release. This way, even though there’s a development community behind the project, you still retain a great deal of control over the central project path.


The road to artificial intelligence in mobility—smart moves required

This overall interest in what AI could enable in automotive and mobility technology leads to a considerable willingness to pay for those features. Of the consumers who indicated high interest in AD features (24 percent of those surveyed), 46 percent are willing to pay more than $4,000 for autonomous-driving features on their next car. And AD features are so important to consumers that 65 percent would switch OEMs for better AD functionality; that figure exceeds 90 percent for young consumers and those living in large cities. Expectations are high, though, and may need to be tempered. On average, consumers expect full autonomy to be widespread in about five years—a tight timeline for any player, and for regulators. Machine learning will have a significant impact on the automotive and mobility industry, since it will unlock entirely new products and value pools and not just lead to productivity improvements.


Blockchain’s explosive growth pushes job skills demand to No. 2 spot

It's not hard to imagine blockchain as a "disruptive skill" that is both fast-growing and hard to find, according to Burning Glass Technologies. While the technology and hiring patterns are in their early stages, it might be a good idea for employers to start figuring out where they will find blockchain talent, "even as they are still considering how the technology will change their business." "Because of its connection with 'cryptocurrencies,' blockchain is associated with finance, and major banks like Liberty Mutual, Capital One and Bank of America have posted openings," Burning Glass Technologies said in its blog. "There are also companies devoted to building blockchain applications, like Consensys Corporation. But the demand for blockchain is much broader, including major consulting firms like Accenture and Deloitte and technology companies like IBM and SAP. ..."


There's a Lot More to AI Than Just Chatbots

Options where the AI uses data to create a model but does not integrate with the DMP (data management platform) are okay and will deliver enhanced business results, but they will never be as powerful as a truly integrated system. Artificial intelligence perceives its environment and makes decisions that maximize its chance of success at any given goal. This could range from optimizing profit margin to maximizing stock efficiency. For example, a supermarket will want to ensure it always has enough salad in stock to supply its customers, while keeping wastage and unsold product to a minimum. A good AI system can take that supermarket's typical sales into account, but it should also be linked to weather information, so that if there is a freak heatwave in October, the weather, and not just October's average salad sales, will be considered.
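The supermarket example can be made concrete with a toy model: forecast salad demand as the monthly baseline plus a temperature effect fitted by least squares, so an unseasonal heatwave lifts the prediction above the calendar average. All numbers here are invented for illustration:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a * x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Invented history: (temperature in C, salads sold above the monthly average).
history = [(10, -40), (15, 0), (20, 35), (25, 80), (30, 120)]
a, b = fit_line([t for t, _ in history], [s for _, s in history])

def forecast_salads(month_avg_sales, temperature):
    """Monthly baseline plus the fitted temperature effect."""
    return month_avg_sales + a * temperature + b

# A 28 C heatwave in October lifts demand well above the October average.
assert forecast_salads(500, 28) > 500
```

An integrated system would feed such external signals (weather, events, promotions) into the model automatically rather than relying on the calendar average alone.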


Microservices Interaction and Governance model - Orchestration v Choreography


In order to understand the options for managing microservice interaction, we should first study its history. Let’s look back to a time almost a decade before microservices really took off. In the early 2000s, the book Enterprise Integration Patterns was published, and the corresponding EIP web site remains an important reference for service interaction to this day. Workflow engines were a popular option back in the days of Service-Oriented Architecture, Business Process Management, and the Enterprise Service Bus. The promise at that time was that you could create orchestration-style APIs without needing to employ a fully trained engineer. Workflow engines are still around, but there isn’t much interest in this option for microservice interaction anymore, primarily because they could not deliver on that promise.
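The contrast in the title, orchestration versus choreography, can be sketched in miniature: in orchestration a central coordinator owns the sequence of calls, while in choreography each service reacts to events and the sequence is emergent. The service and event names below are invented:

```python
# Orchestration: a central coordinator owns the sequence and calls each step.
def orchestrated_flow(order):
    log = []
    log.append("reserved")    # inventory service
    log.append("charged")     # payment service
    log.append("dispatched")  # shipping service
    return log

# Choreography: services subscribe to events; the sequence is emergent.
class Bus:
    def __init__(self):
        self.handlers = {}
    def subscribe(self, event, handler):
        self.handlers.setdefault(event, []).append(handler)
    def publish(self, event, payload):
        for handler in self.handlers.get(event, []):
            handler(payload)

bus, log = Bus(), []
bus.subscribe("order_placed",
              lambda o: (log.append("reserved"), bus.publish("stock_reserved", o)))
bus.subscribe("stock_reserved",
              lambda o: (log.append("charged"), bus.publish("payment_taken", o)))
bus.subscribe("payment_taken",
              lambda o: log.append("dispatched"))

bus.publish("order_placed", {"id": 1})
assert log == orchestrated_flow({"id": 1}) == ["reserved", "charged", "dispatched"]
```

Both styles produce the same flow here; the difference is who knows about the sequence, the coordinator in the first case, nobody in the second.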


The Fear of Disruption Can Be More Damaging than Actual Disruption


The automotive industry is at the start of just such a period. Massive changes appear to be inevitable: connected cars, autonomous vehicles, battery breakthroughs, and the like. But these changes will probably take decades to be fully adopted. The vehicles themselves have been in development for years now, and their potential impact has been analyzed extensively through computer models. Many critical factors will slow down their adoption. These include technical factors, such as the difficulty of designing vehicles for a wide variety of terrains and climate conditions. Incumbent automakers have built up fundamental advantages in design, manufacturing, distribution, sales, and financing, making it hard for new entrants to compete. All manufacturers, old and new, will need time to ramp up so they can produce the necessary technologies at scale. The transition will also require new types of auto repair shops, new fleet-management companies with new sources of capital for financing them, new forms of auto insurance, and new traffic and safety regulations.



Quote for the day:


"Don't wait. The time will never be just right." -- Napoleon Hill


Daily Tech Digest - November 04, 2017

Into the Core of REST


To uncover the hidden nature of the Representational State Transfer style, let’s dissect its name backwards. The word transfer implies that there are at least two processes communicating through some medium, which implies a distributed system. The word state means that one process of a distributed system transfers its internal "view of the (surrounding) world" to another process. This "internal view of the world" is all the relevant information the process requires to do its duty (see Figure 1). It contains both information gathered from the environment and information generated internally, and it is expressed by nouns. The word representational means that processes do not literally send their "internal views of the world" but encode them into descriptions (representations) understandable by recipients. A representation hides the internal nature (implementation) of the process's state.
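The idea that a representation encodes, and hides, a process's internal state can be shown in a few lines. Here a hypothetical order process serializes its public view to JSON while implementation-only fields stay internal; the field names are invented:

```python
import json

# Internal state of a hypothetical order process; underscore-prefixed
# fields are implementation details the recipient should never see.
internal_state = {"order_id": 42, "items": {"sku-1": 2}, "_db_row_version": 17}

def to_representation(state: dict) -> str:
    """Encode the process's 'view of the world' into a representation
    (here JSON) that the recipient understands."""
    public = {k: v for k, v in state.items() if not k.startswith("_")}
    return json.dumps(public)

representation = to_representation(internal_state)
received = json.loads(representation)  # the other process decodes it
assert received == {"order_id": 42, "items": {"sku-1": 2}}
```

The recipient works only with the representation; the sender's storage format and private bookkeeping never cross the wire.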




There are no written test cases for the above test types, since they are techniques based on the experience of each tester. One certainty, however, is that we often write test cases for the test types called functional testing and smoke testing, applying test case design techniques such as equivalence partitioning, boundary value analysis, constraint analysis, state transition testing and condition combination. ... We combined all of these test types, such as exploratory/ad-hoc testing, error guessing, stochastic testing, functional testing and smoke testing, during the testing phase to maximize test coverage. ... We cannot apply test automation to AI, since automation is only useful for stable systems with written test cases, whereas AI behaviors are complicated and random, so AI testing is better suited to manual execution.
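As a concrete illustration of two of those design techniques, here are equivalence partitioning and boundary value analysis applied to a hypothetical validator that accepts ages 18 through 65 inclusive; the partitions are below, inside, and above the range, and the boundary cases sit at each edge:

```python
# Hypothetical function under test: accepts ages 18 through 65 inclusive.
def is_eligible(age: int) -> bool:
    return 18 <= age <= 65

# Equivalence partitions: below the range, inside it, above it.
# Boundary value analysis: exercise each edge and its neighbours.
cases = {
    17: False,  # just below the lower boundary
    18: True,   # lower boundary
    19: True,   # just inside
    40: True,   # representative of the valid partition
    64: True,   # just inside
    65: True,   # upper boundary
    66: False,  # just above the upper boundary
}
for age, expected in cases.items():
    assert is_eligible(age) == expected, f"age {age}"
```

Off-by-one defects cluster at range edges, which is why boundary value analysis adds the neighbours of each boundary rather than sampling each partition once.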


C# BAD PRACTICES: Learn How to Make a Good Code by Bad Example – Part 5

This article is about the Open-Closed Principle (OCP) from the SOLID principles; you don't have to read my previous articles to understand it, but I encourage you to do so. :) My motivation for writing this article is that there is huge confusion around this principle and many different interpretations of it. The principle was confusing to me as well, which is why I went deep into the topic and will present my findings and thoughts about it here. In my opinion it is, alongside the Liskov Substitution Principle, the most difficult of the SOLID principles to fully understand. From my experience I can say that it confuses even senior engineers, and most developers know only its definition without a deep understanding of why and when it is useful. This can lead to blindly applying the rule, which can make the code base bizarre.
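The article works through OCP in C#; the same idea fits in a short Python sketch: the price calculator below is closed for modification because new discount rules arrive as new subclasses, not as edits to existing code. All names are illustrative, not from the article:

```python
from abc import ABC, abstractmethod

class DiscountRule(ABC):
    @abstractmethod
    def apply(self, price: float) -> float: ...

class NoDiscount(DiscountRule):
    def apply(self, price): return price

class SeasonalDiscount(DiscountRule):
    def apply(self, price): return price * 0.9

def final_price(price: float, rule: DiscountRule) -> float:
    # Closed for modification: this function never changes when a new
    # DiscountRule subclass is introduced; the system is open for extension.
    return rule.apply(price)

assert final_price(100.0, NoDiscount()) == 100.0
assert abs(final_price(100.0, SeasonalDiscount()) - 90.0) < 1e-9
```

The anti-pattern the principle warns against is an if/elif chain over discount kinds inside `final_price`, which would have to be edited for every new rule.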


AI: How big a deal is Google's latest AlphaGo breakthrough?


"AlphaGo is an interesting computer science accomplishment, this is algorithm development. [But] I don't think it is necessarily a big meaningful step," he said. "It does allow you to explore a whole bunch of things, related AI algorithms, what are called reinforcement AI algorithms and so on, in that sense it does contribute to the whole thing. "But when it comes to real-world applications in enterprises, I'm not sure AlphaGo makes by itself a significant difference." From Microsoft's perspective, he says that pursuing research that will make it easier for people to chat to computers using text or speech will really transform what's possible with AI. "Really solving every language in every kind of context, being able to create conversational applications and doing so really well, I think that's an incredibly important part of AI innovation, because no matter what, the vast majority of high-value interactions in this world happen using language."


Microsoft quietly announces end of last free Windows 10 upgrade offer

Part of the stated justification for the original exception was the fact that Microsoft was still working on accessibility options for Windows 10, with a specific call-out to changes scheduled to arrive as part of the July 2016 Anniversary Update. There have been two feature updates since then, and the Anniversary Update is now the oldest supported Windows 10 version on the market. ... Corporations that have planned their upgrades to Windows 10 aren't making budgets based on this loophole. Individuals and small businesses that have said no to the upgrade for more than two years are hanging on to the original operating system on older hardware by choice. One practical question is whether Microsoft plans to tighten its activation code and start rejecting the automatic issuance of a digital license for Windows 10 when upgrading from Windows 7 or Windows 8.1 on older hardware.


11 Simple Java Performance Tuning Tips

Most developers assume that performance optimization is a complicated topic that requires a lot of experience and knowledge. Okay, that’s not entirely wrong: optimizing an application to get the best performance possible isn’t an easy task. But that doesn’t mean you can’t do anything without that expertise. There are several easy-to-follow recommendations and best practices that help you create a well-performing application. Most of these recommendations are Java-specific, but there are also several language-independent ones that you can apply to all applications and programming languages. Let’s talk about some of these generic ones before we get to the Java-specific performance tuning tips.


Car Autonomy Levels Explained


The levels of autonomy are a progression of self-driving features that engineering experts SAE International have outlined. These levels range from no self-driving features at all through fully-autonomous driving. ... It's important to note that today, right now, the highest level of autonomy available to us is Level 3—not full autonomy, or even high autonomy, no matter what marketing materials or other automotive publications say. No autonomous car currently exists that can be trusted with the full autonomy of dynamic driving tasks. Audi AI can take over sometimes, under certain conditions, but even Audi AI requires the driver to take over once the system's limitations are exceeded. Audi has correctly dialed back its earlier claims that "The driver no longer needs to monitor the car permanently." Even the press release we criticized last July no longer contains this misleading statement.


The biggest headache in machine learning? Cleaning dirty data off the spreadsheets

“There's the joke that 80 percent of data science is cleaning the data and 20 percent is complaining about cleaning the data,” Kaggle founder and CEO Anthony Goldbloom told The Verge over email. “In reality, it really varies. But data cleaning is a much higher proportion of data science than an outsider would expect. Actually training models is typically a relatively small proportion (less than 10 percent) of what a machine learner or data scientist does.” Kaggle itself is intended to help. The site is best known for its competitions, where companies post a specific data-related challenge and then pay the person who comes up with the best solution. And this means Kaggle has also become a repository of interesting datasets that users can play around with. These range from a collection of 22,000 graded high school essays to CT scans for lung cancer to a whole lot of pictures of fish.
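A trivial example of the kind of work that eats that 80 percent: dropping rows with missing features and removing exact duplicates before any model ever sees the data. The records below are invented for illustration:

```python
# Invented raw records: one duplicate row, one with a missing feature.
raw = [
    {"id": 1, "height_cm": 180, "label": "tall"},
    {"id": 1, "height_cm": 180, "label": "tall"},    # exact duplicate
    {"id": 2, "height_cm": None, "label": "short"},  # missing feature
    {"id": 3, "height_cm": 165, "label": "short"},
]

def clean(records):
    """Drop rows with missing features, then drop exact duplicates."""
    seen, out = set(), []
    for r in records:
        if r["height_cm"] is None:
            continue
        key = (r["id"], r["height_cm"], r["label"])
        if key in seen:
            continue
        seen.add(key)
        out.append(r)
    return out

assert [r["id"] for r in clean(raw)] == [1, 3]
```

Real pipelines also normalise units, reconcile encodings, and fix inconsistent labels, which is why cleaning dwarfs model training in practice.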


Why security in microservices continues to fall short

The microservices world has made things very complex for security individuals in organizations. But it's also made things very difficult for QA testing and DevOps [teams], because it has taken some of the complexity and pushed it down into a DevOps space that didn't exist before. So to me, when people talk about security from an API or microservices perspective, very often what they're focusing on is the security of the container or the configuration management tool. They're talking about a container configuration management tool like Chef, or Tenable's patch management for those containers as well. All of that is great. But what they're not focusing on is the fact that the way the software is being developed is itself completely different. So, let me give you a few examples of how software development and QA processes haven't caught up to deal with the microservices world.


Asset & Wealth Management Revolution: Embracing Exponential Change 

The AWM industry is a digital technology laggard. Technology advances will drive quantum change across the value chain – including new client acquisition, customisation of investment advice, research and portfolio management, middle and back office processes, distribution and client engagement. How well firms embrace technology will help to determine which prosper in the years ahead. Technology giants will enter the sector, flexing their data analytics and distribution muscle. The race is on ... Things will look very different in five to ten years’ time. Fewer firms will manage far more assets significantly more cheaply. Technology will be vital across the business. And, some firms will have discovered new opportunities to create alpha, and restore margins. With change accelerating, all firms must decide how they will compete in tomorrow’s world.



Quote for the day:



"Most successful entrepreneurs are trying to solve real-world problems that they encountered over years of working for someone else." -- Dan Simon


Daily Tech Digest - November 03, 2017

Transforming the organization into a cognitive enterprise will be an arduous task and an evolutionary process. Jobs will not disappear overnight, and many organizations will outright fail to leverage the power of this technology — and will suffer the business consequences as a result. Success is far from inevitable because there are two significant problems when it comes to leveraging machine learning in the enterprise: data and bias. Machine learning only works with data. Lots and lots of data. It’s called machine learning because the machine must be ‘taught’ by giving it data from which it can distill patterns, and, in most cases, the teaching data must be ‘clean’ — meaning that it must be accurate and represent the desired outcomes. This reality means that for machine learning to work, an organization must begin with lots and lots of good, clean data.


How to select the best self-service BI tool for your business
If most of your data is on Azure, you might want to rule out BI systems that run only on Amazon Web Services, and vice versa. If possible, you want the data and the analysis to be collocated for performance reasons. Vendors tend to cite analyst reports that are most favorable to their product. Don't trust the vendor's skimmed abstract or take the diagram they show you at face value: Ask for and read the whole report, which will mention cautions and weaknesses as well as strengths and features. Also take the fact of inclusion in an analyst's report with a large grain of salt: ... Some BI platforms now use in-memory databases and parallelism to accelerate queries. In the future, you may see more highly parallelized GPU-based databases built into BI services — third parties are building these, demonstrating impressive speedups.


Where is my data!? Why GDPR is good for Mainframes

The implications of GDPR for the mainframe are vast. The increased use of mobile devices alone is driving exponential growth in transaction volumes, and that data contains massive amounts of PII. This personal data is spread across the organization, widely used, transformed and accessed in different ways by different people, meaning application-based controls are not enough for complying with the regulation. The key first step toward achieving GDPR compliance for mainframe data is the identification and classification of the data, and determining which data contains PII. Based on that classification, you will have a view of what personal data is being stored and where, and therefore a view of the levels of risk in your organization. If personal data is circulating outside the assigned channels and flows, it’s important to understand why and to assess the associated risk to that data.
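A first, deliberately naive step toward that identification-and-classification task might look like the regex scan below. Real PII discovery on mainframe data needs far more than pattern matching (copybook metadata, data profiling, context), so treat this purely as an illustration; the categories and patterns are invented:

```python
import re

# Illustrative patterns only: real PII discovery needs profiling, metadata
# and context, not just regexes.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "phone": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def classify(record: str) -> set:
    """Return the set of PII categories detected in a raw record."""
    return {name for name, pattern in PII_PATTERNS.items()
            if pattern.search(record)}

assert classify("contact: jane.doe@example.com, 555-867-5309") == {"email", "phone"}
assert classify("order 42, qty 7") == set()
```

The per-record categories can then be rolled up into the dataset-level view of where personal data lives and how much risk it carries.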


Tapping into big data’s potential

With big data you have different aspects, and there is relevance to how central banks deal with the data in general. When you look into the responses to the survey, they clearly show that, although it is unstructured data as far as the research is concerned, it could be structured and voluminous for other purposes – such as the credit register. I think there is a question about what the data is used for, and not so much the size or the structured versus unstructured demarcation. ... Firstly, there are those who say big data is primarily the type of unstructured data the private sector is dealing with. According to a recent BIS review, central banks are clearly interested too, for example, in looking at internet searches for nowcasting. A second area that is really key for central banks is in dealing with very large administrative and financial datasets. It is not simply because it is large that makes it big data, but because it is large and complex.


Facebook's plan to throw humans at security, ... equates to indictment on AI progress

For Facebook, the crisis isn't due to Russians tinkering with election sentiment. The crisis for Facebook is trust. You are the product. If you don't trust Facebook's information, you may not engage as much. Facebook needs you to pass along information. The fact that there is shock -- shock I tell you -- over how Facebook can be used to manipulate the masses is almost comical. After all, those tools are the same reason marketers are fueling Facebook's financial dominance over the ad industry. But this rant isn't an indictment of social media lemmings or Facebook's controls or approach to ads. The Facebook conference call -- and Zuckerberg's solution to double headcount on security and throw humans at the fake news and trust issue -- is really an indictment of its AI prowess. Facebook simply doesn't have the tools or AI engineering to automate its way out of its mess.


Stratis: Blockchain-as-a-Service (BaaS)


Stratis is a flexible and powerful Blockchain Development Platform designed for the needs of real-world financial services businesses and other organizations that want to access the benefits of Blockchain technologies without the overheads inherent in running their own network infrastructure. ... Stratis is designed with the integration of fiat gateways in mind from the outset. It allows financial organizations to use the blockchain for the transfer of existing currencies that are both readily accepted by mainstream consumers and are not subject to damaging volatility: tokens of value that are simply digital equivalents of regular money. This ‘best of both worlds’ approach means that businesses can maintain compliance in whatever way they see fit, according to jurisdiction and organisational policy, while simultaneously using the blockchain as a store of value.


The Future of Cybersecurity Part II: The Need for Automation

Threats are evolving so quickly on the black hat side that the only way to combat them is through automated and intelligent defense layers that can quickly identify new and existing threats and then make decisions to mitigate them. I call this type of cybersecurity defense “actionable intelligence.” It requires deploying interconnected security solutions everywhere across your expanded network, including deep into the cloud. The goal is to create a security solution that is able to see and identify the stages of a threat and then make a decision on its own. Such an expert system is able to identify and block attacks at network speeds so that we don’t have to rely on humans, who often miss too much and respond far too slowly, to take action. This may require rethinking – and even retooling – your security infrastructure. To start, devices need to be able to see each other and share threat intelligence.


Data lake and data warehouse – know the difference

If you’re still struggling with the notion of a data lake, then maybe the following analogy will clarify matters. Think of a data mart or data warehouse as a storage facility stocked with cases of bottled water. Those cases didn’t just magically appear overnight. People and machines gathered and purified the water. After packaging it, only then was it ready for people to buy and drink. By comparison, think of a data lake as a large body of natural water that you would only drink if you were dying of thirst. If you need 50 gallons of water to put out a fire, you don’t need to buy cases of bottled water and empty them out one by one. It’s all there, ready to go. In keeping with this analogy, the “water” in a data lake flows from many places: rivers, tributaries and waterfalls. That is, the data lake doesn’t hold only one type of water (that is, data). Data lakes can house all types of data: structured, semistructured and unstructured.


Blockchain Technology and The Changing Global Economy at the Ethereal Summit

There are many parallels between the adoption of blockchain technology in emerging markets and the mainstream adoption of telecommunication in the 21st century. Instead of using phone lines, developing countries utilized newer technology and developed their infrastructure using satellite wireless communication. By "piggybacking" on the cell technology of developed countries, developing countries were able to incorporate new technology in an efficient and cost-effective way. Similarly, countries with fewer established financial systems are taking advantage of decentralized financial institutions powered by blockchain technology instead of establishing traditional banks. Although implementation speeds will vary by country, blockchain technology has the potential to empower all markets, including those looking for a technological piggyback.


What Is "Cloud-native" Data and Why Does It Matter?


Be aware that in cloud-native systems, the unified log often becomes the system of record. Materialized views show you a representation of that data for a given purpose. This is a different way of thinking about data storage, and for many, turns the idea of a database inside out! The unified log holds individual transactions from your various inputs. Those items may inflate into objects or records in your applications or cache. This may be a new way for you to store data, but it’s proven to be an excellent solution at scale. That said, you don't have to throw out your trusty relational database. Instead, reassess how you use it. For example, if you've been using your relational database for application session state, consider introducing something like Redis and get familiar with key-value stores. At the same time, introduce modern relational databases like Google Cloud Spanner that are designed for geographic resilience and cloud-scale performance on demand.
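The unified-log idea is easy to sketch: an append-only event log is the system of record, and a materialized view is just a fold over that log, rebuilt for a given purpose. A minimal Python illustration (the account and event names are invented, and a real system would use a durable log such as Kafka rather than an in-memory list):

```python
# Append-only event log: the single source of truth in this sketch.
log = []

def append_event(event):
    log.append(event)

def materialize_balances(events):
    """Fold the event stream into one purpose-built view: account balances."""
    view = {}
    for e in events:
        view[e["account"]] = view.get(e["account"], 0) + e["amount"]
    return view

append_event({"account": "alice", "amount": 100})
append_event({"account": "bob", "amount": 40})
append_event({"account": "alice", "amount": -30})

balances = materialize_balances(log)  # {'alice': 70, 'bob': 40}
```

Because the log is the record, any number of other views (per-day totals, audit trails) can be derived from the same events later, which is what turns the database "inside out."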



Quote for the day:


"If you are filled with pride then you'll have no room for wisdom." -- African Proverb


Daily Tech Digest - November 02, 2017

It may seem reasonable to worry about researchers developing very advanced artificial intelligence systems that can operate entirely outside human control. A common thought experiment deals with a self-driving car forced to make a decision about whether to run over a child who just stepped into the road or veer off into a guardrail, injuring the car’s occupants and perhaps even those in another vehicle. Musk and Hawking, among others, worry that a hyper-capable AI system, no longer limited to a single set of tasks like controlling a self-driving car, might decide it doesn’t need humans anymore. It might even look at human stewardship of the planet, the interpersonal conflicts, theft, fraud, and frequent wars, and decide that the world would be better without people. Science fiction author Isaac Asimov tried to address this potential by proposing three laws limiting robot decision-making.


The fundamental benefits of the 'as a service' model are well known, and include: a shift from capital to operational expenditure (capex to opex), often leading to lower TCO (total cost of ownership); access for businesses of all sizes to up-to-date technology, maintained by service providers that can leverage economies of scale; scalability according to business requirements; fast implementation times for new applications and business processes; freeing up staff and resources for other projects and priorities. Of course there are potential downsides to 'as-a-service' adoption, which include: service outages; security, governance and compliance issues; inadequate performance; hidden costs; service provider lock-in; and customer support issues. Most of these potential problems can be minimised with good planning and a tightly-defined SLA ...


Microsoft Open Sources Java Debugger for Visual Studio Code Editor

Using the VS Code Java Debugger
This week, while noting the move to open source, Microsoft also announced the open sourcing of the Java Debug Server that provides support on the back-end. "Since we first released our Java Debugger extension for Visual Studio Code on Sept. 28, it quickly became the most trending extension of the month," said Xiaokai He, program manager, Java Tools and Services, in a blog post. "And of course, lots of feedback and suggestions were submitted from our active developer community. ..." The two main improvements to the debugger mentioned by He are automatic resolution of a project's main class, so developers don't have to explicitly specify it anymore, and fully supported external source files. Speaking of the latter, He said, "With this feature, you can now also debug with third-party classes, when the source code is inside a JAR or a source attachment. And you can also set breakpoint in those classes ahead of debugging."


VHPC is on the rise, but comes with its own challenges

Despite these improvements, there are still challenges on the software side. HPC is often highly tuned and, moreover, might run on a nonmainstream Unix or Linux distribution that has many proprietary tweaks. Examples include Catamount OS, which the U.S. Department of Energy's National Nuclear Security Administration Advanced Simulation and Computing Program uses on the Red Storm supercomputer, the Compute Node Linux used in some Cray models and IBM's Compute Node Kernel. These are all lightweight kernels that minimize OS overhead. It can be difficult to get these through any hypervisor certification process, especially if they involve device drivers. One might think the answer is to go directly to a certified and supported Linux distribution, but one major issue with parallelized operations is that a compute cycle -- say an iteration of a simulation -- isn't complete until the last server finishes.


Cohesity makes it easier to manage secondary storage

Cohesity customers should see a marked reduction in total cost of ownership (TCO). The deduplication and data management capabilities will likely reduce the overall amount of data stored by more than 40 percent. The reduction in secondary storage is certainly nice, but the big savings to be had is operational. By my estimate, 60 percent of the TCO from secondary storage is in people costs. Just as HCI has taken a chunk out of the operational costs associated with running different workloads, it will have the same impact on secondary data management. In fact, because this area lacks any kind of best practices or strategy, it’s likely to cut operational costs by 50 percent or more, freeing up valuable time for more strategic things. If you love HCI, you’re certainly not alone, as it’s been one of the fastest-growing IT technologies.


Why digital assistants are so hot right now


AI-enabled agents make an attempt to solve the “paradox of choice” that often leads to lower customer satisfaction and abandoned carts. Many retailers, including eBay, Walmart, and Whole Foods, bet on AI-powered virtual shopping assistants to fine-tune their offerings. ...Obviously, there’s a bigger picture for each individual benefit of digital assistants. On the one hand, the growth of voice assistants runs parallel with the progress in artificial intelligence, IoT, self-driving technology, and emerging interfaces based on text, audio, image, and haptic signals. The intelligent agent serves as a practical tool for today’s high-tech environment. It becomes indispensable to the normal functioning of the new generation of devices and emerging driverless cars, connected homes, and smart cities. On the other hand, AI-enabled assistants serve as a mediator between humans and innovation.


What is Asana? Task management tracking made easy

“[Asana] allows teams in organizations to determine how they need to work together,” said Margo Visitacion, vice president and principal researcher at Forrester. “Whether they want to work in a way that is driven by conversations or by tasks, they have the opportunity to work in the way that is comfortable for the team.”  Raúl Castañón-Martínez, senior analyst at 451 Research, said that Asana will benefit teams that previously relied on a diverse set of tools like spreadsheets, file sharing and even email and chat apps. “As projects grow more complex it becomes a burden trying to manage teamwork this way.”  The software is relatively easy to use when compared to more complex project management tools such as Trello and Smartsheet and is aimed at a wide range of business professionals. Asana’s product design is one of its key strengths, said Castañón-Martínez.


Google's grand plan for health, from fitness apps right up to defeating death

Its best-known healthcare product is Streams, an app designed to decrease the incidence of acute kidney injury before it occurs by alerting clinicians to the warning signs that indicate a patient is a candidate for such an injury. While the app itself doesn't contain any AI at present -- think of it as simple analytics software for healthcare -- it's likely that such elements will make their way into the products in future. The system is being trialled with the Royal Free hospital, and may be extended to other conditions where picking up the right signs early on can prevent a full-blown life-threatening condition, such as sepsis, taking hold. Other partnerships with UK healthcare organisations show the direction of travel for DeepMind's products. For example, in pilots with the Moorfields Eye Hospital and University College London Hospital ...


Heart-stopping cybersecurity threats — literally

As the number of internet-connected medical devices and their respective vulnerabilities continues to grow, we must proactively take substantive steps to bolster their security and protect the Americans who rely on them by establishing health-care industry guidelines for how best to defend against these types of radical cyber assaults. I was joined by my colleague, Rep. Susan Brooks (R-Ind.), in introducing the Internet of Medical Things Resilience Partnership Act, legislation that will bring public and private sector counterparts together to address the vulnerabilities of medical technologies by establishing a robust, yet malleable, comprehensive cybersecurity framework. We cannot stand idly by while these imminent attacks threaten the American people.


Beware the promise of a digital silver bullet

It could be the Architecture Tribe where everything is about infrastructure. Or the Automation Tribe who declare that robots are the only way forward. Or the Radical Redesign Tribe who will tell you it’s pointless doing anything unless you completely rethink the company from the ground up. Meanwhile, all you may want to do is row the boat a bit faster. And we chose the word ‘tribes’ with good reason. This isn’t a cohort of people rationally discussing the rights and wrongs of all their diverse approaches; these are frequently groups with fervent and invested beliefs in their own technical specialisms, who may be fiercely competitive, yet still need to be knitted together with a common purpose to create an environment of change. So how can a leader do all that? We’ve been developing a series of tools to help, one of which is the Digital Change Curve.



Quote for the day:


"The two most powerful warriors are patience and time." -- Leo Tolstoy


Daily Tech Digest - November 01, 2017

Dremio: Simpler and faster data analytics
Dremio utilizes high-performance columnar storage and execution, powered by Apache Arrow (columnar in memory) and Apache Parquet (columnar on disk). Dremio also uses Apache Calcite for SQL parsing and query optimization, building on the same libraries as many other SQL-based engines, such as Apache Hive. ... Dremio is the first execution engine built from the ground up on Apache Arrow. Internally, the data in memory is maintained off-heap in the Arrow format, and there will soon be an API that returns query results as Arrow memory buffers. A variety of other projects have embraced Arrow as well. Python (Pandas) and R are among these projects, enabling data scientists to work more efficiently with data. For example, Wes McKinney, creator of the popular Pandas library, recently demonstrated how Arrow enables Python users to read data into Pandas at over 10 GB/s.
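The advantage of the columnar layout that Arrow and Parquet use can be shown with a toy sketch in plain Python. This is not Arrow's actual API, just the underlying idea: when each column is stored as its own contiguous array, an aggregate over one column never has to touch the others.

```python
# Row-oriented layout: each record carries all of its fields together.
row_table = [
    {"id": 1, "region": "EU", "revenue": 120.0},
    {"id": 2, "region": "US", "revenue": 300.0},
    {"id": 3, "region": "EU", "revenue": 80.0},
]

# Columnar layout of the same table: one contiguous list per column.
col_table = {
    "id": [1, 2, 3],
    "region": ["EU", "US", "EU"],
    "revenue": [120.0, 300.0, 80.0],
}

# A row-wise aggregation visits every record, pulling in unneeded fields...
total_rows = sum(r["revenue"] for r in row_table)
# ...while the columnar version scans a single array, cache-friendly and
# trivially vectorizable -- the property Arrow exploits at scale.
total_cols = sum(col_table["revenue"])

assert total_rows == total_cols == 500.0
```

The same column-pruning effect is why a columnar engine can answer "sum of revenue" without reading ids or regions from memory or disk at all.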


Bad Rabbit ransomware data recovery may be possible


The Kaspersky team wrote in a blog post that early reports that the Bad Rabbit ransomware leaked the encryption key were false, but the team did find a flaw in the code where the malware doesn't wipe the generated password from memory, leaving a slim chance to extract it before the process terminates. However, the team also detailed an easier way to potentially recover files. "We have discovered that Bad Rabbit does not delete shadow copies after encrypting the victim's files," Kaspersky researchers wrote. "It means that if the shadow copies had been enabled prior to infection and if the full disk encryption did not occur for some reason, then the victim can restore the original versions of the encrypted files by the means of the standard Windows mechanism or 3rd-party utilities."


Cybersecurity: How Blockchain Is Helping E-Commerce Businesses Protect Their Data


By using blockchain-based smart contract technology, e-commerce businesses can be confident that data on a global network is visible only to those who are authorized to receive that data in a timely manner, enhancing the security of transactions. Smart contracts are computer protocols that facilitate transactions. They help you exchange money, property, goods, services, or anything of value in a transparent way. But they also put a premium on security since only permitted parties have access to the data. That’s because blockchain-based smart contracts are visible only to those users permitted access to the blockchain. This ensures that only certain people have access to certain data and bars outsiders from gaining entry.


Is more IoT driving more cyber attacks?

For general users like us, right now, there are two kinds of cyber attacks: inbound and outbound. Inbound cyber attacks target our smart devices like phones, tablets, or cameras directly. DNS Amplification Attacks are common outbound attacks, with over 80% of family-level cyber attacks resulting from router issues. To this point, Help Net Security suggested three tips to actively avoid attacks. First, we need to periodically change the passcode of our smart devices and family Internet. Second, do not connect to unknown Wi-Fi and Bluetooth devices. Last but not least, upgrade device software in a timely fashion. Nowadays, both iOS and Android send out upgraded versions regularly, and even every app on our phones releases upgrades frequently. Some users think these upgrades are annoying and choose to shut down this function, but most of the upgrades are related to security issues.


Enterprise Architecture For The Internet Of Things

It is an understatement to say that the introduction of the Internet required major changes in enterprise architectures. IT was suddenly not only managing internal applications but had to take on an external-facing web access function which grew over time from providing basic information to being fully integrated with enterprise functions like marketing, sales, support, logistics, production, documentation, and engineering. As organizations started to take advantage of evolving Internet capabilities, new functions and structures evolved over time. CIOs, webmasters, and SEOs suddenly became critical to the enterprise, and as web and mobile applications spread, so did the substantial operational IT headaches of keeping everything fully tested, functional, and operational.


30 Percent Of CEO Emails Have Been Exposed In Breaches, Leaks

According to F-Secure’s research, the breaches that revealed the highest number of CEO credentials were from sites and services that one would commonly associate with the business or corporate world. Hacks of business social network LinkedIn, which occurred in 2012 and exposed more than 117 million users, and of popular cloud storage service Dropbox, which also happened in 2012 and resulted in 68 million account credentials being stolen, were responsible for 71 percent of all of the exposures. In addition to having their email addresses and passwords exposed, CEOs have also had other personal information leaked through breaches. Eighty-one percent of CEOs have had data including physical addresses, birthdates and phone numbers exposed, the researchers found—many of which came from spam lists and marketing databases that were stolen.


Even data scientists are facing AI takeover

This aversion to the dreaded word “automation” may stem from the fact that even data scientists are starting to worry about its potential impact on their own job security. It’s with this cultural zeitgeist in mind that I read Andrew Brust’s recent article about Alteryx’s new tool for “operationalizing” machine learning models. He provides a very good discussion not only of the data-science productivity-boosting benefits of that offering, but of different solutions from other vendors that all, to varying degrees, push automation deeper into data-science development, deployment, and optimization workflows. ... Although Brust says there’s “nothing but upside” to the prospect of squeezing manual labor out of the data-science workflow, it’s clear that many low-level functions, which might otherwise be handled by less-skilled (but nonetheless employed) data scientists, might never be touched by human hands ever again.


6 Steps to Building a Business Case for Enterprise Architecture

Once you’ve decided on a destination, your EA GPS will provide turn-by-turn instructions on where you are now, where you want to go, and how you’re going to get there. It can also make you aware of what resources you’ll need and have access to along the way, what risks and/or obstacles you’re likely to encounter, and how to navigate around them to arrive at your strategic destination. If enterprise architecture can provide the visibility and supporting information to achieve strategic goals, why aren’t more companies investing more aggressively in EA practitioners and tools? Continuing to choose spreadsheets and static diagrams as the source of record for your EA initiatives is like choosing a gas station map in the glove compartment instead of a GPS. Is this information still accurate? Who knows – at least the map was cheap, right?


Vancouver’s chief technology officer keeping the city ahead of the digital curve


“The number one issue we deal with, hands down, is usually not a tech challenge, it’s a cultural challenge. And then the public sector adds another dimension as well in the sense that we also need to deal with process challenges with quite a few regulatory and compliance requirements,” Adcock highlights. “The trick to digital transformation in the public sector is to try and achieve that best-in-class user experience and that DNA change within the organization, all within the parameters of what our mandate is and what we’re required to do. It’s an extra layer we have to consider.” She says that justifying a complete digital transformation within the public arena can be a challenge as well, given that it is not necessarily in the same competitive environment as a business would be.


Blockchain Could Help Us Reclaim Control of Our Personal Data

At a whole system level, the database is very secure. Each of the billions of individual ledger entries would need to be found and then individually “cracked” at great expense in time and computing, making the database as a whole very safe. Distributed ledgers seem ideal for private distributed identity systems, and many organizations are working to provide such systems to help people manage the huge amount of paperwork modern society requires to open accounts, validate yourself, or make payments. Taken a small step further, these systems can help you keep relevant health or qualification records at your fingertips. Using “smart” ledgers, you can forward your documentation to people who need to see it, while keeping control of access, including whether another party can forward the information. You can even revoke someone’s access to the information in the future.
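The tamper-resistance described above comes from hash-chaining: each ledger entry commits to the hash of the one before it, so altering any entry invalidates every later link. A minimal Python sketch, not a production ledger (the entry fields and document names are invented for illustration):

```python
import hashlib
import json

def entry_hash(entry):
    """Deterministic SHA-256 over a canonical JSON encoding of the entry."""
    return hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()

def append_entry(ledger, payload):
    """Append an entry that commits to the hash of its predecessor."""
    prev = ledger[-1]["hash"] if ledger else "0" * 64
    entry = {"payload": payload, "prev": prev}
    entry["hash"] = entry_hash({"payload": payload, "prev": prev})
    ledger.append(entry)

def verify(ledger):
    """Recompute every link; any edited entry breaks the chain from there on."""
    prev = "0" * 64
    for e in ledger:
        if e["prev"] != prev:
            return False
        if e["hash"] != entry_hash({"payload": e["payload"], "prev": e["prev"]}):
            return False
        prev = e["hash"]
    return True

ledger = []
append_entry(ledger, {"doc": "passport", "holder": "alice"})
append_entry(ledger, {"doc": "degree", "holder": "alice"})
assert verify(ledger)

ledger[0]["payload"]["holder"] = "mallory"  # tampering with one entry...
assert not verify(ledger)                   # ...is detected on verification
```

Cracking a single entry unnoticed would therefore mean recomputing the hashes of every subsequent entry, which is the expense-in-time-and-computing barrier the article refers to.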



Quote for the day:


"Real leaders are ordinary people with extraordinary determinations." -- John Seaman Garns