Daily Tech Digest - May 12, 2018

Boston Dynamics' SpotMini robot dog goes on sale in 2019


Who'll buy it? Probably not you, at least to start. Raibert didn't reveal price plans, but said the SpotMini robots could be useful for security patrols or for helping construction companies keep tabs on what's happening at building sites. SpotMini can be customized with attachments and extra software for particular jobs, he said. Eventually, though, the company hopes to sell it for use in people's homes. "Most places have something where wheels don't get you everywhere," Raibert said. "We think SpotMini can go to a much larger fraction of places." Boston Dynamics is among the highest-profile robot companies out there. It made a bang with its gas-powered BigDog quadruped, which could navigate challenging terrain while keeping its balance. Later, the company unveiled Atlas, a humanoid robot that can do flips, pick up boxes, and now run. SpotMini, whose development began while Boston Dynamics was a Google subsidiary, is remarkable for being cute, as well as fascinating to watch. That's pretty valuable given how leery a lot of us are about our future robot overlords.



Back to the Future: Demystifying Hindsight Bias


When using the original dataset, information about the target label crept into the training data. Boat and Body are known only in the future, after the event has already occurred; they are not known in the present, when the prediction is made. If we train the model on such data, it will perform poorly in the present, because that information would not legitimately be available. This problem is known formally as hindsight bias, and it is prevalent in real-world data, as we have witnessed first-hand while building predictive applications at Salesforce Einstein. Here is an actual example in the context of predicting sales-lead conversion: the data had a field called deal value, which was populated intermittently when a lead was converted or was close to being converted (similar to the Boat and Body fields in the Titanic story). In layman's terms, it is like Marty McFly (from Back to the Future) traveling to the future, getting his hands on the Sports Almanac, and using it to bet on the games of the present. Since time travel is still a few years away, hindsight bias is a serious problem today.
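This kind of leakage can be guarded against mechanically by excluding any field that is only populated after the outcome is known. A minimal sketch of the idea (the field names and records here are hypothetical, not Salesforce's actual schema):

```python
# Hypothetical lead records: "deal_value" is filled in only when a lead is
# converted or close to converting, so it leaks the target label.
leads = [
    {"industry": "retail", "employee_count": 40,  "deal_value": 12000, "converted": 1},
    {"industry": "media",  "employee_count": 15,  "deal_value": None,  "converted": 0},
    {"industry": "retail", "employee_count": 200, "deal_value": 55000, "converted": 1},
]

# Fields known only in hindsight must never reach the feature set.
LEAKY_FIELDS = {"deal_value"}
TARGET = "converted"

def make_training_row(record):
    """Return (features, label), excluding the target and any leaky fields."""
    features = {k: v for k, v in record.items()
                if k != TARGET and k not in LEAKY_FIELDS}
    return features, record[TARGET]

X, y = zip(*(make_training_row(r) for r in leads))
print(sorted(X[0]))  # ['employee_count', 'industry'] -- no deal_value
```

The essential discipline is that the exclusion list is maintained as data, so the check runs on every training build rather than relying on someone remembering which columns are "from the future."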


Cloud-Based Product Lifecycle Management Market is touching new levels

HTF MI recently introduced a new title, "Global Cloud-Based Product Lifecycle Management Market Size, Status and Forecast 2025," from its database. The report provides a study with an in-depth overview, describing the product and industry scope and elaborating the market outlook and status to 2025. It gives a competition analysis of the top manufacturers, with sales volume, price, revenue (million USD), and market share; the top players include Dassault Systemes, Siemens AG, PTC Inc, Oracle Corporation, SAP SE, Autodesk, Inc, Arena Solutions, Aras, Infor, and Accenture PLC. In this report, the global Cloud-Based Product Lifecycle Management market is classified on the basis of product, end user, and geographical region. The report includes in-depth data on revenue generation by region and on the major market players in the Cloud-Based Product Lifecycle Management market.


The future for service – will you focus on AI, voice or search?


Sadly, service delivery today is anything but routine, predictable or scalable. Take a new application built in the cloud – an issue with the cloud provider could lead to all customers being locked out of their service. With each and every customer suddenly needing assistance, scaling up to cope with the problem is difficult; diagnosing the issue with a supplier is also tricky. Coping with a bigger problem and automating responses where possible is therefore necessary. In the State of the Service Desk Report, 13,000 service desk teams provided their insights into what is working and what is needed to cope in future. Around 69 per cent of front line responders stated that they spent too much time firefighting, rather than being able to plan ahead through better problem management. Similarly, around a quarter pointed to increased automation as essential for their efficiency. Yet each company will have to look at its own approach to automation – there is no one size fits all solution. There are a number of new options that service teams can take to evolve their approach – voice, AI and search. 


How to Achieve Sustainable Employee Engagement in Healthcare

Enabling employees to do meaningful work is critical to employee engagement, and requires a consistent feedback loop and the right systems and processes to support them. Technology can be a powerful accelerant that offloads mundane tasks and allows employees to apply their skills and expertise to the things that technology can’t do—innately human things that require empathy, connectivity, communications, and influence. Unfortunately, many healthcare organizations are still operating on legacy systems and their employees are bogged down by slow technology that prevents them from fully engaging in their jobs. These employees end up spending significant time working on things that they weren’t hired to do such as piecing together and fact-checking spreadsheets and reports—activities that they should be able to do within the technology. The right technology will allow your workforce to do their best work by making what encompasses their role more automated, manageable, and efficient. And as regulations and patient expectations continue to change, the systems you choose should be agile enough to change with your organization’s needs. 


Three Fintechs leading Open banking initiatives in the UK

As the world starts warming up to the open banking culture, there is always going to be a tug of war between control and agility. As regulators tune their policies around data sharing and open banking, they will have to decide how much control financial services firms have over customer data. At the same time, it is critical to work towards an agile open banking framework within a controlled and secure data-sharing ecosystem that takes care of customers' interests. The UK, as in most other aspects of Fintech, has been spearheading open banking in policy and execution, but it would be myopic to assume that open banking starts and ends in the UK. I have touched upon different regulatory approaches to open banking and customer data sharing across the globe in my previous posts. Today, I focus on three of my favourite Fintechs in the UK that are regulated to provide open banking services. ... These players and a few others not only add efficiencies to their businesses through open banking APIs and data analytics, but also create opportunities for the businesses partnering with them.


The ethics lessons will continue until morality improves

So, why didn't Build start with that? For exactly the same reason that reactions to Google Duplex have been so divided: because technology powered by AI has the potential to make our lives far, far better -- or far, far more unbearable. Microsoft showed a meeting room camera system that recognised people walking into the room, greeted them by name, and transcribed every word they said -- even if their deafness made them a little harder to understand. That deaf team member could join in at an equal level with everyone else, and so could remote colleagues. Everyone got a list of what they had said they were going to do, delivered to their to-do lists. Empowering and convenient -- exactly the kind of system the $25 million AI for Accessibility grant programme Nadella announced is there to create. The same system in a railway station in a country with an authoritarian government, or even left on in an HR meeting room where someone is trying to report an abusive boss, would be deeply worrying. Google showed its Duplex assistant phoning a restaurant and sounding enough like a human to be treated like a real customer.


The hybrid cloud provides a best of both worlds solution

The direction that cloud services and cloud providers are heading in at the moment can be summed up in two major points. Cloud providers seem to be focusing primarily on, first, expanding their infrastructure and making it available in a number of different geographical locations, and second, ensuring a variety of options and services, including IaaS and PaaS layers, are available so users are not turned away. One may raise the question of cloud providers not working as actively on security solutions, but this is offset by the shared-responsibility model they have adopted, which treats cloud security as the responsibility of both provider and user, equally. This is why a hybrid cloud system seems the ideal solution: it allows enterprises to stay at the front of the technology race in the cloud while retaining critical workloads on-premises to keep them secure. Despite a great number of entrants finding a haven in cloud and data centre technologies, a proper, flexible security solution for hybrid cloud systems still remains to be formulated.


Coaching with Curiosity Using Clean Language and Agile


The Clean for Teams training is all about getting the team to be curious and supportive of each other using Clean Questions. It works wonders as long as people use no more than three questions in a row at a given time, keeping it light and not going as deep as you might in professional performance coaching or therapy.  In a recent workshop I gave, two colleagues were pairing up to practice the questions that they had just learned. They decided to use as a topic a discussion they had had the prior day at work. During the debrief, one commented that the trajectory of the conversation had been richer and more revealing than had been the conversation the day before. They used only a few questions and had had only 15 minutes of exposure to Clean Language. So yes, it is possible with the right guidance to put it to use in your everyday work, whether in a coaching relationship or not. You will experience an improvement in the way people relate to you and you to them, which is one of the outcomes of good coaching. The conditions for peer-to-peer coaching include having a space to listen, and a technique to separate out your own thinking so that you can stay within the mental model of the person you are listening to.


Understand Microservices Monitoring


The ultimate goal, of course, is for processes, errors, and bottlenecks to be managed in ways that are totally transparent to end users, as microservices-based platforms fix themselves with the help of microservices analytics. In the event of a bottleneck, for example, an end-user customer who tries to buy a widget or service on the Web would ideally never receive an error message that might prompt the user to "try again later." Developing microservices orchestrations and associated analytics capabilities is easier said than done in-house, of course. To that end, third parties have emerged with solutions and services for organizations that lack the resources to develop the architectures in-house. "Microservices are moving toward mainstream use today and often show many integration points with existing monolithic enterprise applications," Torsten Volk, an analyst for Enterprise Management Associates (EMA), said. "Meanwhile, vendors of DevOps-centric application and infrastructure analytics software are stepping up to monitoring this often complex and dynamic world of applications consisting of shared services with often disconnected release schedules."
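In practice, the "never show try again later" goal usually comes down to retry-with-backoff logic at the service boundary, so transient bottlenecks are absorbed rather than surfaced. A minimal sketch of that pattern (not taken from the article; names and numbers are illustrative):

```python
import time

def call_with_retries(operation, attempts=3, base_delay=0.1, sleep=time.sleep):
    """Retry a flaky service call with exponential backoff, so a transient
    bottleneck is absorbed instead of surfacing as an error to the user."""
    last_error = None
    for attempt in range(attempts):
        try:
            return operation()
        except ConnectionError as err:
            last_error = err
            sleep(base_delay * (2 ** attempt))  # back off before retrying
    raise last_error  # exhausted all attempts; let monitoring see the failure

# Simulated downstream service that fails twice, then succeeds.
calls = {"n": 0}
def flaky_checkout():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("upstream bottleneck")
    return "order confirmed"

print(call_with_retries(flaky_checkout, sleep=lambda s: None))  # order confirmed
```

Real orchestration layers add circuit breakers and fallbacks on top of this, but the core idea is the same: the failure is retried and recorded for analytics rather than shown to the customer.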



Quote for the day:


"To make a decision, all you need is authority. To make a good decision, you also need knowledge, experience, and insight." -- Denise Moreland


Daily Tech Digest - May 09, 2018

Europe may come to regret its new set of data rules


Worse, the rules could impede innovation. Many blockchain companies could be shut out entirely. Cloud computing may become substantially more complicated. Systems that rely on artificial intelligence could in many cases be incompatible with the GDPR’s mandates. It’s an ominous sign that Facebook has already started pulling some data projects from Europe. Yet all this is more or less by design; there will also be unintended consequences. Although the GDPR aims to improve data security, for instance, its privacy rules may compromise a crucial tool used by security researchers, thereby increasing spam, phishing attacks and malware. Its compliance costs could inhibit cybersecurity investment. Its emphasis on obtaining consent for data collection is, in practice, likely to mean endless “click to proceed” boxes that leave customers little more informed — and significantly more irritated — than before. For all these drawbacks, the EU deserves credit for illuminating — and attempting to resolve — a very real problem. European law enshrines a right to privacy. 


In Cybersecurity, Accountability Could be the Ultimate Innovation

Sacrificing short-term gains to reinforce the company’s mission has understandably been a big positive for their brand—and it’s been great for their business. In December of 2017, CVS announced it would buy Aetna, a move that could very well reshape the health insurance landscape in this country. Cybersecurity is an industry that can desperately use a dose of accountability-as-innovation. Accountability in cybersecurity is virtually non-existent. Despite billions of dollars spent worldwide on cybersecurity solutions, our position in cyberspace is now more precarious than ever. Recently, the World Economic Forum’s (WEF) Global Risks Landscape 2018 ranked cyber attacks alongside extreme weather events and the prospect of nuclear war as the most likely and dangerous risks threatening the stability of society. That means, on the internet, “attackers could trigger a breakdown in the systems that keep societies functioning.” Which we just saw happen last month when cyber actors held critical services provided by the city of Atlanta for ransom and even took Baltimore’s emergency 911 response system offline.


Forget Windows; Microsoft is now all about the cloud

Windows resides in the More Personal Computing segment, the revenue leader, but don’t let that deceive you. A closer look tells the real story. ... There’s no breakdown of Windows versus cloud, but Microsoft did say the Azure public cloud’s revenue boomed 93% year over year. The previous quarter it grew 98% year over year. And Microsoft also said that what it calls its “commercial cloud,” made up of Azure, Office 365, Dynamics 365 and other cloud services, brought in $6 billion in revenue in the third quarter, which was up 58% year over year. The More Personal Computing segment was up far less — only 13% year over year. Also notable in the third quarter: Windows and Devices chief Terry Myerson left the company. You can be sure he didn’t depart because Microsoft was going to devote more attention to Windows. Keep in mind, also, that a lot of Microsoft products are now essentially cloud-based, so there’s even more cloud revenue at the company than first meets the eye. Microsoft Office, for example, is increasingly a cloud service, with the company pushing Office 365 heavily over the client version of the Office suite.


What is an API? Application programming interfaces explained

Diving a little deeper, an API is a specification of possible interactions with a software component. For example, if a car was a software component, its API would include information about the ability to accelerate, brake, and turn on the radio. It would also include information about how to accelerate: Put your foot on the gas pedal and push. The “what” and “how” information come together in the API definition, which is abstract and separate from the car itself. One thing to keep in mind is that the name of some APIs is often used to refer to both the specification of the interactions and to the actual software component you interact with. The phrase “Twitter API,” for example, not only refers to the set of rules for programmatically interacting with Twitter, but is generally understood to mean the thing you interact with, as in “We’re doing analysis on the tweets we got from the Twitter API.” Let’s dig in by looking at the Java API and the Twitter API as examples. First, we’ll get a quick picture of these two APIs and how they fulfill the definition of “what” and “how.” Then, we’ll talk about when you’ll likely use APIs and what goes into a well-designed API.
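The car analogy can be made concrete in code: the API is the abstract specification (the "what"), and a component is one implementation of it (the "how"). A minimal sketch in Python (the class names are invented for illustration):

```python
from abc import ABC, abstractmethod

class CarAPI(ABC):
    """The 'what': the operations the component promises to support."""

    @abstractmethod
    def accelerate(self, pedal_pressure: float) -> float:
        """Increase speed; return the new speed."""

    @abstractmethod
    def brake(self) -> float:
        """Reduce speed; return the new speed."""

class HatchbackCar(CarAPI):
    """One concrete component; callers depend only on CarAPI above."""

    def __init__(self):
        self.speed = 0.0

    def accelerate(self, pedal_pressure: float) -> float:
        self.speed += 10.0 * pedal_pressure  # the 'how' lives here
        return self.speed

    def brake(self) -> float:
        self.speed = max(0.0, self.speed - 20.0)
        return self.speed

car = HatchbackCar()
print(car.accelerate(0.5))  # 5.0
```

The key point mirrors the article's: code written against `CarAPI` keeps working if `HatchbackCar` is swapped for any other implementation, because the specification is abstract and separate from the component itself.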



Antipattern of the Month: Unresolved Proxy

Any proxy must be respected as having executive authority regarding value, so as not to be undermined. This includes authority over the articulation and ordering of work on a Product Backlog and how it is represented to the Development Team. The proxy must be a genuine and competent representative of the "real" PO, and recognized as being fully able to take decisive action and to provide information in a timely way. Unfortunately, though, a proxying model can be resorted to as a salve when genuine product ownership is weak. Stakeholders might expect a certain product capability to be available, but none may necessarily wish to own it. This can be the case with middleware for example. Several proxies might then be used, each of whom will represent certain capabilities on behalf of a notional though absent Product Owner. Great discipline is needed when a single clear proxy is unrecognized, since all must then agree to establish compensatory protocols through which they collaborate beyond their narrow interests.


Why CEOs Should Embrace Minimally Viable Moves

Minimally viable moves allow companies to pursue big bets with incremental amounts of risk instead of big sweeping chunks. It’s akin to an MVP (minimally viable product), which is designed to represent just enough of a new market-facing offer that you can get real feedback about it and course-correct as necessary. An MVM involves making just enough of an organizational change to determine whether or not the move will be valuable to your business. This is beneficial and empowering for business leaders at all levels. Instead of feeling that responding to disruption is equivalent to betting the farm, MVMs provide enough cover so that if mistakes happen, decision-makers don’t feel forced back to the drawing board. Going slow and steady allows for on-the-fly adjustments and never having to double back because of hastiness. Which minimally viable move you make depends on your organization and your objectives. For example, you can alter protocol for a common type of decision, skip a management feedback step in preparing for a customer visit, or shift hiring practices for a certain role.


Windows critical flaw: This security bug is under attack right now

In an advisory crediting Qihoo 360 Core Security researchers and Kaspersky Lab malware analysts for discovering a critical bug tagged as CVE-2018-8174, Microsoft details a remote code execution flaw residing not in Internet Explorer but the Windows VBScript engine. However, it also explains the bug can be exploited through Internet Explorer. Microsoft hasn't confirmed this is the bug reported by Qihoo 360 Core Security but notes the flaw is being exploited in the wild. "In a web-based attack scenario, an attacker could host a specially crafted website that is designed to exploit the vulnerability through Internet Explorer and then convince a user to view the website," Microsoft notes. "An attacker could also embed an ActiveX control marked 'safe for initialization' in an application or Microsoft Office document that hosts the IE rendering engine." Observed attacks have started with a malicious Word document, which when opened downloads an exploit written in VBScript that's hosted on a webpage, according to malware analysts at Kaspersky Lab.


Google’s developer show highlights the promise and perils of its data hoard


Google has stressed that its data systems are more secure and it keeps information anonymous. "We've long had a very robust and strong privacy program at Google," Pichai told investors last month. Yet Google already gives many Android app creators access to a sea of personal information, including location history and some shopping behavior. And it has been routinely criticized for the vast targeting in its advertising business and the spread of misinformation on search results and YouTube. Last week, Google said its latest security product, which restricts outside access to personal accounts like Gmail, was available for the iPhone. In March, it unveiled a new plan to stamp out fake news. Expect similar announcements at I/O. But the company will have to offer developers new features, too, some of which will likely give them fresh ways to track where people go and how they interact with their devices. There are about 25 conference sessions this week on the Google Assistant, a voice-enabled, AI-powered service that the company is trying to spread further and faster than Amazon's Alexa.


The Impact of MiFID II on Data Management: Q&A with MarkLogic's Ken Krupa

The volume of data that needs to be recorded makes the regulation a huge technology challenge. Many companies are finding that they have to update their technologies, infrastructures, and data management processes. To be compliant, firms need transparency and the ability to maintain a consistent view of the trade landscape at any point in time. All of these requirements will have a broad impact on data management and IT infrastructure, in large part because the old ways of dealing with data are no longer sufficient. The evolution of the IT infrastructure in the financial services industry has led to proliferation of systems and fragmentation of data. Also, the rapid rise of social media, instant messaging, forum usage, unstructured data as a source of new content, and trader behavior analytics has increased the amount of information that grows outside transactional systems. All of this new information now falls under the remit of compliance and business planning.


How to create a data strategy for enterprise IoT

When it comes to enterprise adoption of IoT, most deployments are still in a pilot or proof-of-concept phase, according to Forrester Research senior analyst Paul Miller. These projects are often driven by operational teams, and are not necessarily linked to enterprise-wide technology strategies for cloud or data. "A lot of these deployments are early, small, and often under the radar of central IT," Miller said. "As they become more mission critical, there will be a very real need to ensure that they do comply with things like data policies, privacy policies, and security policies. But it's still early days, and there's relatively little formal policy around IoT deployment at the moment." Most companies are examining how to manage their existing data, in terms of how to secure and extract value from it, said Mark Hung, a research vice president at Gartner. "Both the speed and scale of data that IoT brings is a new challenge," Hung said. With so many endpoints, companies need to prepare to manage a large influx of information that must be analyzed in close to real time to gain the greatest insights, he added.



Quote for the day:


"The tragedy of life doesn't lie in not reaching your goal. The tragedy lies in having no goal to reach." -- Benjamin Mays


Daily Tech Digest - May 08, 2018

Who wants to go threat hunting?

To become a threat hunter, one must first work as a security analyst and likely graduate into IR and cyber threat intelligence fields. Combined with a bit of knowledge of attacker methodology and tactics, threat hunting becomes a very coveted skill. Threat hunting is one of the most advanced skillsets one could obtain in information security today. The core skills of a threat hunter include security operations and analytics, IR and remediation, attacker methodology, and cyber threat intelligence capabilities. Combined, a hunter is the special operations team of an organization’s defensive and detection capabilities. A threat hunter is taking the traditional indicators of compromise (IoC) and instead of passively waiting to detect them, is aggressively going out looking for them. Traditional intrusion detection doesn’t do a great job on the crafty adversary. They will avoid tripping the normal intrusion detection defenses. It takes a threat hunter to find them. Not every company can have one. It takes a certain size and sophistication. ... Threat hunting teams need threat intelligence plus a network person, an endpoint person, a malware analyzer, and a scalable bunch of tools. A threat hunting team is like special operation forces.
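At its simplest, hunting on IoCs means actively sweeping telemetry for known-bad indicators rather than waiting for a detection system to alert. A toy sketch of that sweep (the indicator values use documentation-reserved IP ranges and a placeholder hash, not real threat intelligence):

```python
# Hypothetical indicators of compromise (IoCs) from a threat-intel feed.
KNOWN_BAD_IPS = {"203.0.113.7", "198.51.100.23"}
KNOWN_BAD_HASHES = {"d41d8cd98f00b204e9800998ecf8427e"}

log_lines = [
    "2018-05-08 10:01 conn src=10.0.0.5 dst=93.184.216.34",
    "2018-05-08 10:02 conn src=10.0.0.9 dst=203.0.113.7",
    "2018-05-08 10:03 exec hash=d41d8cd98f00b204e9800998ecf8427e",
]

def hunt(lines):
    """Actively sweep log lines for known IoCs instead of waiting on alerts."""
    hits = []
    for line in lines:
        if any(ip in line for ip in KNOWN_BAD_IPS) or \
           any(h in line for h in KNOWN_BAD_HASHES):
            hits.append(line)
    return hits

print(len(hunt(log_lines)))  # 2
```

A real hunt team layers endpoint queries, behavioral analytics, and attacker-methodology hypotheses on top of simple indicator matching, but the proactive posture is the same: go looking, don't wait.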



Quantum computers and Bitcoin mining – Explained

With the application of the science behind electrons, the energy required to mine bitcoins will be drastically reduced, with a direct positive impact on the environment. It is immaterial where the computers are located, because one of the properties this technology employs is having objects in any place at the same time. Because of the energy challenge, most mining companies using classical computers have opted to set up their data centers in regions with cold weather conditions for most of the year. Quantum computers adequately sort out that issue. The economics of mining will definitely improve, with miners no longer having to worry about the exorbitant electricity bills they currently contend with. A large number of mining companies have had to migrate to China, where they can capitalize on relatively cheaper electricity. This need not be the case anymore, as particles can exist in multiple locations at once with quantum computing. The technology of quantum computing is in itself an incentive, motivating more people to engage in mining.


Should you let IT teams choose their own collaboration tools?

While CIOs should consider stepping back from dictating what IT can use, they are still responsible for vetting, integrating and maintaining the solutions IT chooses, Palm says. “You’re not just a strategic advisor to the business as far as how these tools can enable efficiency and innovation, but you’re helping your teams choose the best tools to help them do the best job they can,” he says. Red Hat’s Kelly notes a fundamental issue CIOs encounter when shifting to a choose-your-own approach: “What you don’t want to do is completely let go, and then all of a sudden you have fifty different ways people are communicating with each other — that’s a mess. You as a CIO have to walk a line between standing there and saying, ‘You are required to use this and only this,’ and making it a free-for-all.” One chief concern is the possibility of constraints that regulatory compliance and data governance may have on these decisions, depending on your industry, Kelly says.


Financial sector cyber-related laws are a bellwether, says Deloitte


“While it is generally not possible to control when you have a crisis, quite often the cause of these crises is a cyber security incident, so it is worth information security teams in organisations engaging with the privacy teams to help understand where the organisation’s core risks lie, so they can prepare for these crises. A good response makes a huge difference.” Another thing that “absolutely attracts regulator attention”, said Bonner, is “pockets of complaints”, because even if the regulator does not have the resources to follow up on every single isolated complaint, if there are several customer complaints about a single organisation, the regulator will pay attention. “The lack of resources means that regulators will draw conclusions based on the nature and volume of the complaints,” he said. “So it could be by chance that a couple of entirely separate parts of your organisation have an issue that gets escalated to the regulator, but the conclusion will be that the organisation has a systemic problem.”


Making sense of Handwritten Sections in Scanned Documents


It is challenging to achieve acceptable extraction accuracy when applying traditional search and knowledge extraction methods to these documents. Chief among these challenges are poor document image quality and handwritten annotations. The poor image quality stems from the fact that these documents are frequently scanned copies of signed agreements, stored as PDFs, often one or two generations removed from the original. This causes many optical character recognition (OCR) errors that introduce nonsense words. Also, most of these contracts include handwritten annotations which amend or define critical terms of the agreement. ... In recent years, computer vision object detection models using deep neural networks have proven to be effective at a wide variety of object recognition tasks, but require a vast amount of expertly labeled training data. Fortunately, models pre-trained on standard datasets such as COCO, containing millions of labeled images, can be used to create powerful custom detectors with limited data via transfer learning – a method of fine-tuning an existing model to accomplish a different but related task. Transfer learning has been demonstrated to dramatically reduce the amount of training data required to achieve state-of-the-art accuracy for a wide range of applications.


Smarter cities: why data-driven cities are the only option

New and exciting mobile and static infrastructure technologies are enabling safer communities, new low-cost utility services, more efficient city operations, and intelligent low-emissions transportation systems. One example of a UK smart city is Milton Keynes which showcases driverless pods that ferry citizens along fixed routes across the city. ... Smart city programmes involve mass deployments of sensors which are necessary to gather data to justify and manage change. While sensors can be very cheap, the deployment of these sensors can often be very expensive, especially at a city-wide scale. But rather than deploying an entirely new network of sensors, cities and local authorities can improve efficiency by leveraging existing sensor networks to avoid expensive infrastructure investments. Telematics companies may seem unlikely partners in this endeavour, but with some of the world’s largest organically grown vehicle datasets, telematics providers can grant access to aggregated data already blanketing cities across the globe – without the expense of building a sensor network.


11 Industries That Will Soon Be Disrupted By Blockchain

Let's face it: many people are resistant to technological change, both in their personal lives and at the office. What they often lack is the vision to see how the new technology they are resisting will improve their lives in the future. Emerging technologies are exciting and bring innovation and new opportunities across the globe. They change our lives by altering the way we think and operate on a daily basis. Technological innovation can affect far more than our daily routines; it can disrupt entire industries and change the way we do business. As new technologies are developed, affected industries are forced to adapt or be replaced. The newest technology quickly becoming the next major disruption is blockchain, a digital ledger system used to securely record transactions, and it is poised to change the way business is done across the globe. Here are 11 prominent industries slated to be overhauled by blockchain technology in the near future.

1. The Gambling Industry. The days of coins falling from slot machines when someone hits a jackpot are long gone, but the coin could soon make a return in digital form via blockchain. RAcoin's mission is to make blockchain technology an essential part of the traditional gambling industry and to distribute RAcoin as a fully featured gambling cryptocurrency.

2. The Payment Industry. Kora is building an infrastructure for cross-border payments that uses the blockchain to make financial transactions between people and businesses more transparent. It also brings added value such as identity and interoperability across a range of other financial services. Kora's use of blockchain aims to unlock growth in emerging markets by connecting people, communities and capital; its services include access to marketplaces, payments, transfers, investments, and the ability to lend and pool capital across any community. Kora's native token, KNT, is the medium of interaction to be used on its platform in the future.

3. The Real Estate Industry. Anyone who has ever purchased or sold a home knows just how much paperwork is involved in a real estate transaction. Blockchain technology can eliminate much of the headache those documents cause: all of the documents and transaction records can be stored securely with measurably less work and less cost. According to Piper Moretti, CEO of the Crypto Realty Group and a licensed realtor, blockchain could also potentially eliminate the escrow process, since smart contracts can release funding only when the agreed conditions are met. Additionally, anyone who has worked with a real estate agent knows how frustrating commission rates can be, with many agents charging up to 6 percent. Deedcoin is looking to change that with its cryptocurrency-powered platform: using Deedcoin's platform and proprietary tokens, those rates drop to just 1 percent. Deedcoin's distributed architecture gives power back to homeowners and buyers by tokenizing the process and eliminating middlemen, leaving only direct interactions between agents and customers.

4. The Healthcare Industry. The healthcare industry has long needed a significant disruption in how it shares and stores medical data and records. The potential for error, fraud, and lost records has created distrust between consumers and healthcare providers. Blockchain technology can rebuild that trust by securely storing medical records that can be accurately and safely transferred to, and accessed by, authorized doctors and patients. Blockchain will also aid in authorization and identification; one startup, Ontology, is already working to make positive, multi-source identification a reality across all industries using blockchain technology.

5. The Legal Industry. Blockchain technology is poised to disrupt some areas of the legal industry through its ability to store and verify documents and data. For example, litigation over the wills of the deceased or other documentation could be eliminated: records (including wills) stored on the blockchain can be quickly and securely verified, and any changes to the documents are authenticated and recorded. Blockchain technology can also resolve legal issues around inheritance, including cryptocurrency assets. Safe Haven, for example, lets users secure digital assets so that an investor's legacy can be passed down to children or designees safely and securely, eliminating lengthy court battles over digital inheritance.

6. The Cryptocurrency Exchange Industry. Digital money is the way of the future, and it is thanks to blockchain that it can be securely transferred and recorded. However, the "mining" required to verify and authenticate every transaction of digital money requires an enormous amount of computing power. In recent years this has created problems on several platforms when transactions "ran out of gas" and fizzled out due to the sheer amount of computation required, costing users valuable time and money. New developments in blockchain technology are changing the way the cryptocurrency exchange industry operates: Zen Protocol has developed an alternative platform whose smart contracts know in advance how much computation each contract requires, so a contract simply won't run unless there is enough "gas" to support it.

7. Politics. In the recent past, government parties in the U.S. and around the world have been accused of rigging election results. That would not be possible if blockchain handled voter registration, identity verification, and vote counting, ensuring only legitimate votes were counted. Gone would be the days of recounts and voting-day drama.

8. The Startup Industry. With thousands of startups looking for investors, there is currently no way for them to get in front of the right investors without jeopardizing the security of their ideas, and no good way for investors to find the companies they want to back. Blockchain technology can change all of that, and in fact it already has started to: companies such as Pitch Ventures are creating a way for startups to pitch investors live and securely. Entrepreneurs create summaries of their product or service, and investors can quickly sort through them to find potential opportunities. An Ethereum smart contract address provides a secure medium for the pitches, so privacy is maintained.

9. The Video Industry. Video is predicted to make up 82% of all Internet traffic by 2021, and blockchain may play a significant role by decentralizing the video infrastructure. Decentralizing video encoding, storage, and content distribution will dramatically reduce the cost of video traffic by tapping into $30 billion in wasted Internet computing services. Startups like VideoCoin are already making good on the promise of freeing up this capital, which could allow entirely new and innovative ecosystems of video apps to emerge on the market.

10. The Education Industry. The education industry is poised for significant breakthroughs from an emerging version of the Internet that combines blockchain, cryptocurrency, and virtual reality. This new Internet, dubbed the "3DInternet," has the power to create a global classroom like never before, and SocratesCoin is making big moves to make it a reality. The company will create a global community of faculty, students, campuses, and curriculum, with students of all ages, cultures, and locations. SocratesCoin has secured Nauka University, which will use the 3DInternet to unite science and thought leadership through education. Blockchain's distributed-ledger technology provides a safe and auditable way to record and transfer data; it can transform the way we live our everyday lives and disrupt any industry that touches data or transactions at all.

11. The Banking Industry. Blockchain technology has the potential to solve several significant problems faced by the banking industry today. Right now banks both store money for their customers and handle the transfer of that money. Blockchain's inherently secure design would provide permanent records of the millions of transactions that take place in banking each day, and this ledger system could significantly lower risk by providing secure records. Furthermore, the decentralization blockchain provides would let money be transferred faster and more cheaply.

And all of this disruption is a good thing. Whether or not you like introducing new tech into your life, I think we can all agree that added security for our financial data would give everyone more peace of mind.

About the author: John White is the CMO and founder of Social Marketing Solutions. White writes at the crossroads of social media, entrepreneurship, startups, and marketing.


Microsoft's Project Brainwave brings fast-chip smarts to AI at Build conference


Project Brainwave brings two important differences to conventional AI. First, it uses a fast and flexible but unusual processor type called an FPGA, short for field programmable gate array. It can be updated often to accelerate AI chores with the latest algorithms, and it handles AI tasks rapidly enough to be used for real-time jobs where response time is crucial. Second, customers eventually will be able to run the AI jobs with Microsoft hardware at their own sites, and not just by tapping into Microsoft's data centers, which speeds up operations another notch. "This is a unique offering," said Forrester analyst Mike Gualtieri. The project is a microcosm of the AI revolution sweeping the tech industry. On the one hand, it's maturing fast enough to become useful for countless tasks -- digesting legal contracts, finding empty parking spaces, looking for hiring biases and generating 3D models of people's bodies, limbs and heads from a video.


More time equals more opportunity for cyber attackers


Given enough time, a criminal siphoning data can slow the attack down to a level where it may look like normal network traffic noise, rather than attempt to send out gigabytes of data from a database, for example. New data can also be gained over time, such as new oil well exploration or pharmaceutical research. If this arrives in an already compromised database, the attacker is positioned, ready and waiting, and only needs to exfiltrate it. Third, a rushed attack can often be rolled back to a previous backup without too much trouble or data loss. If exploitation of a database occurs today and is discovered, restoring the database leaves only a short batch of transactions that may need to be updated, once the route in has been strengthened. As a result, the business impact is low. Conversely, an attack that takes place over many months may mean long periods of compromised backups, requiring extensive manual work to rebuild from the last known successful backup. In extreme cases, reliance on these backups may not be possible as tapes deteriorate or are reused/recycled.


What is edge computing?

As centralized as this all sounds, the truly amazing thing about cloud computing is that a seriously large percentage of all companies in the world now rely on the infrastructure, hosting, machine learning, and compute power of a very select few cloud providers: Amazon, Microsoft, Google, and IBM. ... The advent of edge computing as a buzzword you should perhaps pay attention to is the realization by these companies that there isn’t much growth left in the cloud space. Almost everything that can be centralized has been centralized. Most of the new opportunities for the “cloud” lie at the “edge.” So, what is edge? The word edge in this context means literal geographic distribution. Edge computing is computing that’s done at or near the source of the data, instead of relying on the cloud at one of a dozen data centers to do all the work. It doesn’t mean the cloud will disappear. It means the cloud is coming to you.



Quote for the day:


"Leadership is working with goals and vision; management is working with objectives." -- Russel Honore


Daily Tech Digest - May 07, 2018

The Impending Facial Recognition Singularity

Facial Recognition
The result is that you will be identified every time your picture appears. If you use a real photo as your avatar, then your accounts will be connected, even if you use different names, account IDs, and email addresses. Even if you post an untagged photo of yourself to a site, the surrounding text will probably allow the system to know that you are in the picture somewhere; and after a handful of pictures, it will be obvious to the computer which face is yours. On the opposite side of the spectrum, however, an account with absolutely no photos might prevent identification, but it will stand out as fake. The offline situation is no better. Cameras are becoming so inexpensive that they are built into all kinds of things. Cameras can be a cheap and easy way of allowing computers to sense and react to their environment. For example, cameras have been built into thermostats, smoke detectors, doorbells, and toys. Over time, all of these camera-equipped IoT devices have created an Internet of Cameras (IoC). When paired together, government and private cameras provide almost complete coverage of our lives. Soon we will be seen and recognized everywhere we go.


Building Cybersecurity Shock Absorbers For The Enterprise

cyber resilience shock absorber
You know your data best, he continues, you know which “systems...are most important, what is the downtime that you can afford to have, what is the data move, where does the data exist.” Outside parties aren’t in your company every day. The only way they understand your priorities is through you. That doesn’t mean you shouldn’t look beyond yourself for advice. Building resiliency across the entire organization takes everyone. Non-security colleagues may have better ideas than you think. Mignona Cote, global head of identity and access management for insurance company AIG, notes that there’s a department in every business that’s mitigated risk much longer than infosec: accounting. “The finance people have been control people for years, way before we were,” she explains. “When I was an IT person and tried to do something with numbers or whatever, it always knocked the general ledger out of balance and people would come looking for me. They actually knew how to look at the logs -- the transaction logs -- which [security] never really embraced. There's a level of control that we need to focus on outside of what we typically do as IT professionals.”


Google could be getting serious about IoT with release of Android Things

Raspberry Pi 3 and Android Things
The idea behind Things is to provide a unified, one-size-fits-all software option for the developers of constrained devices like smart displays, kiosks and digital signage, among others. Device makers won’t be allowed to modify parts of Android Things’ code, specifically the parts that ensure Google can flash updates to all devices running the software at any time. That’s a potentially major sea-change for the IoT should Things use become widespread. If security is far and away the biggest stumbling block to IoT deployments, the inability or unwillingness of some device makers to regularly update their software to patch known security holes is arguably the biggest part of that problem. Regular, guaranteed software updates could go a long way toward making IoT more attractive to the more risk-averse enterprise and industrial users that will account for all that exponential growth being predicted for the IoT marketplace. There’s many a slip ‘twixt the cup and the lip, of course – Things is architected as more of an entry-level, consumer-style product at this point, for starters. But the multiplicity of developer sessions scheduled for it at this year’s I/O conference suggests that Google is serious about moving the framework forward as an option for device makers, and broadening its appeal among them.


How Mobile AI Will Transform Our Lives


The future of mobile AI is rapidly progressing. Businesses involved in the component manufacture and app development for the mobile phone industry aim to make improvements in the following areas. Better components and hardware features improve the ability of a mobile device to gather information from its surrounding environment. Previously, the phone camera was just a way to capture images and record videos, while the microphone was a way for the user to communicate during calls. In the mobile phone of the next generation, the camera and microphone will act as the eyes and ears of the intelligent phone. These components are expected to give the phone the ability to become aware of the world around it and make recommendations for its users’ benefit. Add the face recognition and GPS location feature to the mix and we come very close to a device that can understand its users’ wants and act as an assistant rather than just a communication device. The face recognition feature is particularly useful, as it would give the phone the ability to recognize the user’s emotions. The device would know when the user is sad, happy, or hungry.


How to Collect Meaningful Data

The hard truth here is that bad data leads to bad decisions. Thus, it is important to take the time necessary to build a proper data collection process. Two weeks ago, as I completed my big data certification, the importance of proper data collection became clear. It also reminded me of some basic data collection techniques I learned during Six Sigma training. That's what I want to share with you today. There are many benefits to building a proper data collection process. The primary benefit will be to the teams that need to sift through the data for insights. The sooner they get value from the data, the better. This saves time and money for everyone involved. Having a proper data collection process allows you to document what data is being collected, by whom, and for what purpose. Your data collection process should be part of a larger data governance strategy. Unfortunately, data governance is one of those things that happens only after a company grows to a certain size. (So is data security, but I digress.) Here's a simple process outline for you to review. It's worked well for me over the years. Feel free to adopt or change it for your own needs. Use whatever you can to build a data collection process that helps you gather meaningful data.
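The core of the documentation step described above (recording what is collected, by whom, and for what purpose) can be sketched in a few lines. This is a minimal illustration, not any particular governance tool; the `CollectionRecord` and `CollectionCatalog` names are invented for the example:

```python
from dataclasses import dataclass, field

@dataclass
class CollectionRecord:
    """Documents one data source: what is collected, by whom, and why."""
    dataset: str
    owner: str                                # team or person responsible
    purpose: str                              # why the data is collected
    fields: list = field(default_factory=list)

class CollectionCatalog:
    """A tiny catalog; real governance tooling adds access control, lineage, etc."""
    def __init__(self):
        self._records = {}

    def register(self, record: CollectionRecord):
        # Refuse undocumented collection: every dataset needs a stated purpose.
        if not record.purpose:
            raise ValueError("every dataset needs a documented purpose")
        self._records[record.dataset] = record

    def owner_of(self, dataset: str) -> str:
        return self._records[dataset].owner

catalog = CollectionCatalog()
catalog.register(CollectionRecord(
    dataset="web_clickstream",
    owner="analytics-team",
    purpose="funnel analysis",
    fields=["session_id", "page", "timestamp"],
))
print(catalog.owner_of("web_clickstream"))  # analytics-team
```

Even a registry this small answers the two questions that matter most when a team later questions a metric: who owns the data, and why it was collected in the first place.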


Deep learning comes full circle

Artificial Intelligence digital concept; illustration of brain as connected network
Whatever the underlying reason, insights gleaned from the 2014 study led to what Yamins calls goal-directed models of the brain: Rather than try to model neural activity in the brain directly, instead train artificial intelligence to solve problems the brain needs to solve, then use the resulting AI system as a model of the brain. Since 2014, Yamins and collaborators have been refining the original goal-directed model of the brain’s vision circuits and extending the work in new directions, including understanding the neural circuits that process inputs from rodents’ whiskers. In perhaps the most ambitious project, Yamins and postdoctoral fellow Nick Haber are investigating how infants learn about the world around them through play. Their infants – actually relatively simple computer simulations – are motivated only by curiosity. They explore their worlds by moving around and interacting with objects, learning as they go to predict what happens when they hit balls or simply turn their heads. At the same time, the model learns to predict what parts of the world it doesn’t understand, then tries to figure those out.


The Hub Problem with Distributed Backup

Organizations need an alternative to using the primary data center as the centralized hub. The cloud may be the ideal hub. In a cloud model, IT sends primary data center and remote office data to a public cloud provider, which acts as the centralized repository. Data is copied once and there is one primary store of all backup data. Some solutions will cache data at each remote office and the primary data center so that restores of recently protected data can be quickly serviced but the actual movement of data is just one step. A DR copy can automatically be created by replicating the cloud copy within the cloud infrastructure. The other advantage of using the cloud as the hub is that it almost guarantees remote management will be of high quality since all sites essentially require remote management. It also means that traveling or vacationing IT personnel can remotely manage the primary data center data protection process. Using the cloud also better positions the organization to adhere to various legal and regulatory standards that dictate where data can reside since the larger public cloud providers have multiple data centers in multiple regions.
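The hub-and-cache flow above can be sketched as a toy model. The class and method names are invented for illustration, under the stated assumptions: data moves once from the site to the cloud, a DR replica is made within the cloud, and each site keeps a small cache of recent backups for fast restores:

```python
class CloudHubBackup:
    """Toy model: the cloud is the single backup hub; each site caches recent data."""
    def __init__(self, cache_size=2):
        self.cloud = {}            # authoritative copy (one primary store)
        self.dr_copy = {}          # DR replica created within the cloud
        self.cache = {}            # per-site cache of recently protected data
        self.cache_size = cache_size
        self._order = []           # insertion order, for simple eviction

    def backup(self, name, data):
        # Data moves once: site -> cloud. The cloud replicates internally for DR.
        self.cloud[name] = data
        self.dr_copy[name] = data
        self.cache[name] = data
        self._order.append(name)
        if len(self._order) > self.cache_size:
            evicted = self._order.pop(0)
            del self.cache[evicted]   # cache keeps only recent backups

    def restore(self, name):
        # Recent restores are served locally; older ones come from the hub.
        if name in self.cache:
            return self.cache[name], "cache"
        return self.cloud[name], "cloud"

hub = CloudHubBackup(cache_size=2)
hub.backup("mon", b"db-mon")
hub.backup("tue", b"db-tue")
hub.backup("wed", b"db-wed")      # "mon" is evicted from the site cache
print(hub.restore("wed"))          # recent: served from the local cache
print(hub.restore("mon"))          # older: pulled from the cloud hub
```

The point of the sketch is the asymmetry: backup traffic always takes one hop to the hub, while restore latency depends only on whether the data is still recent enough to sit in the local cache.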


Why Network and Security Operations Centers Should be Doing More

Even though a NOC or a SOC consolidates a variety of tools and measurements into a single management system, they are still too isolated. Rather than this siloed approach, what's needed is a system that brings security visibility and control into the NOC, and provides operational requirements and network and workflow visibility to the SOC. By combining these systems into a single, holistic solution, organizations can focus on the bigger picture of "secure throughput" that can streamline operations while managing and even anticipating critical security events. This new approach could also help overworked IT teams operate with the benefit of each other's perspective, and enable organizations to realize a new level of protection and operational management that can simultaneously adapt to network changes. Not only will this added insight allow organizations to see events more clearly, but it also enables the development of effective automation that allows the network to respond to an event at digital speeds without impacting critical business processes.


Eight things to expect at Google I/O 2018


I/O will be more consumer and developer facing, so we should expect to hear more about products like Google Lens, as well as the company’s TensorFlow platform and its Tensor Processing Unit chips. Those chips are the core of the company’s specially designed AI training systems, and they help the company accelerate the learning process for its neural networks. Also expect to hear a lot of the same grandiose predictions about AI that we heard onstage at Facebook’s F8 developer conference last week, when executives also described AI as the future of Facebook’s business. Of course, it’s no surprise that Google and Facebook compete for top talent, as both companies have rival AI research divisions that command some of the highest salaries in the tech industry. ... Google Assistant and the Google Home hardware family it primarily lives on are slated to be big consumer-facing focuses for the company at this year’s I/O. Assistant remains Google’s largest competitive push against Amazon’s Alexa and, to a lesser extent, Apple’s Siri and Microsoft’s Cortana. And while Assistant does live on iOS and Android devices as an app and voice interface, it’s most readily useful as the OS layer for any number of smart home devices, starting with Google’s smart speaker family.


The 7 Fundamentals of IT Consultant Success

The 7 fundamentals of IT consultant success
Don’t understate the value of the insights you gained working in their industry, Perkins says. “They are what will differentiate you in the early days of your consulting career,” he says. “Others will know the methods, tools and craft skills of consulting, but few will have the depth of industry-specific insight you bring to the table. Trade on this.” As you develop a sense of which industry sectors most interest you, seek out assignments that will extend your expertise, Perkins says. “Your value increases the deeper you go,” he says. “And conversely, actively manage yourself away from industry specializations that don’t interest you.” Early in his consulting career, Perkins was assigned to two large agricultural chemical clients in a row, and was beginning to be referred to as the “AgChem” subject-matter expert. “Nothing wrong with AgChem, but I fancied myself a financial services technology strategist and took steps to gain experience in other areas,” he says. “At the same time, though, don’t neglect the emerging technologies and methodologies that will keep you attractive to a broad range of client and assignment types.”



Quote for the day:


"When leaders are worthy of respect, the people are willing to work for them. When their virtue is worthy of admiration, their authority can be established." -- Huananzi


Daily Tech Digest - May 05, 2018

Besieged Cambridge Analytica Shuts Down

Besieged Cambridge Analytica Shuts Down
Tuesday at Facebook's F8 developer event, the social media giant announced a number of measures to put the control of data use back in the hands of the user, including the ability to scrub all data. "Cambridge Analytica should be viewed as a cautionary tale for any firm handling personal data," says Julie Conroy, director at Aite Group. "Just as the rash of breaches elevated cybersecurity to a C-suite and board-level issue over the past few years, the firestorm around Cambridge Analytica's various abuses illustrates why consumer data control and privacy also need to be top-of-mind issues for all company executives." When the news of the Facebook data leak scandal broke in March, the scale of the impact and aftershocks quickly became apparent. Facebook's CEO, Mark Zuckerberg, eventually testified before U.S. House and Senate committees about the firm's privacy practices. Because Zuckerberg has failed to appear before Collins' committee, despite repeated requests, Collins warned Facebook in a Tuesday letter that he's prepared to issue a summons for Zuckerberg's appearance.



What is IO Acceleration? – JetStream Software Briefing Note

Caches are small and volatile; IO acceleration is large and durable. Caches were designed when memory-based storage was very expensive. If an organization could serve 50% of its IO operations from cache, that was considered effective, but it still meant that 50% of the traffic had to cross the network and access data from hard disks. In-memory caches are not durable, meaning that power loss means data loss. The potential for data loss meant they were not safe for write caching, so all writes had to go to the hard disk tier. While writes make up less of the IO distribution of the typical environment, they are the slowest part of the IO chain: flash writes data more slowly than it reads it, and each write often triggers an additional set of writes for flash management and data protection. The typical sizing of IO acceleration, on the other hand, allows it to service 90% or more of all read requests, and its design lets it work with a variety of storage devices, including all-flash arrays and even the cloud. IO acceleration also provides durability, protecting data outside of the system on which the acceleration software is installed, so that a power failure or even a server failure does not result in the loss of data.
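The volatile-cache versus durable-tier distinction can be made concrete with a small sketch. This is a toy LRU model, not any vendor's implementation: writes land in the backing store before they are cached (so eviction or power loss cannot lose data), and reads fall back to the slower tier only on a miss:

```python
from collections import OrderedDict

class DurableReadCache:
    """Toy IO-acceleration layer: a read cache that persists writes first."""
    def __init__(self, backing, capacity):
        self.backing = backing        # slower durable tier (array, cloud, ...)
        self.capacity = capacity
        self.cache = OrderedDict()    # LRU order: oldest entries first
        self.hits = self.reads = 0

    def write(self, key, value):
        self.backing[key] = value     # durable first: power loss loses nothing
        self.cache[key] = value
        self.cache.move_to_end(key)
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)   # evict least recently used

    def read(self, key):
        self.reads += 1
        if key in self.cache:
            self.hits += 1
            self.cache.move_to_end(key)
            return self.cache[key]
        value = self.backing[key]     # miss: go to the slower tier
        self.cache[key] = value
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)
        return value

    def hit_rate(self):
        return self.hits / self.reads

backing = {}
cache = DurableReadCache(backing, capacity=3)
for k, v in [("a", 1), ("b", 2), ("c", 3), ("d", 4)]:
    cache.write(k, v)                 # "a" is evicted, but survives in backing
for k in ["d", "c", "b", "a"]:        # three hits, one miss
    cache.read(k)
print(cache.hit_rate())               # 0.75
```

In production the acceleration tier is sized so the hit rate is 90% or better; the essential properties shown here are that every write is durable before it is acknowledged and that only misses pay the cost of the slow tier.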


Blockchain: Prep starts now; adoption comes later


To avoid investments in hardware, early blockchain experiments will likely take place on pay-per-use models such as public cloud. While this will allow projects to scale, companies will have to contend with issues related to costs, security, data privacy, compliance, and vendor lock-in. In the meantime, what should companies do to prepare for blockchain? Now is a good time to start evaluating use cases. Vendor or conference workshops can help educate IT and line-of-business executives to what blockchain can do today and get started on documenting the processes for specific use cases. Workshops are an opportunity to get both internal and external parties involved. Generally speaking, companies wouldn't use a blockchain inside an organization. A stronger value proposition is provided by a consortium blockchain that crosses multiple organizations, which establishes a trusted mechanism for recording transactions, implementing smart contracts, and building other blockchain applications.


Digitization makes the Supply Chain agile and customer-related

Internet of Things
Industry 4.0 truly adds value to operations by providing the capability of analysing large amounts of data. Big Data analytics is one of the pillars of this new revolution and supply chain personnel need to understand that there would simply be more information coming their way. Everything involved in a process, right from a warehouse rack, to a guillotine machine, to a supply container, will have the ability to communicate, which will then require analysis and the CSCO needs to be ready for this. Highly automated process equipment and complex IT infrastructure does not eliminate the need for workers. On the contrary, it creates the need for highly skilled workers, who can effectively utilise the information available at their disposal. Future workforce would need to be competent at problem solving and systems engineering. It is crucial for a leader to understand the current workforce and their capabilities, in order to help modify the existing human resource to be ready for the challenges Industry 4.0 brings. Another key aspect for the CSCO to consider would be the end-to-end visibility across the supply chain.


Why Google Assistant could help make Android wearables more business-friendly

With the latest changes to Wear OS, Google has added two features to make using your watch as an organizational tool easier and more precise. Smart suggestions will generate options to narrow a query, with Google using the example of asking Wear OS about the weather. When a user asks about the weather, the current temperature and conditions appear on the screen—nothing is new there. What is new are suggestions available with a swipe up from the bottom of the screen: An evening forecast, weekend weather, and other recommendations appear as tappable buttons. Suggestions are available for various interactions and functions, similar to Google Assistant suggestions on Android smartphones. Google said it designed smart suggestions for quick interactions on the go, which can be great if you don't want to have an extended conversation with your wrist in public. The second productivity feature Google added to Wear OS is spoken responses, which can be a huge boon for busy people. Instead of displaying text for certain interactions on the screen, Assistant will now speak out loud via a watch's internal speaker or connected Bluetooth device.


9 machine learning myths


Machine learning is proving so useful that it's tempting to assume it can solve every problem and applies to every situation. Like any other tool, machine learning is useful in particular areas, especially for problems you’ve always had but knew you could never hire enough people to tackle, or for problems with a clear goal but no obvious method for achieving it. Still, every organization is likely to take advantage of machine learning in one way or another, as 42% of executives recently told Accenture they expect AI will be behind all their new innovations by 2021. But you’ll get better results if you look beyond the hype and avoid these common myths by understanding what machine learning can and can’t deliver. Machine learning and artificial intelligence are frequently used as synonyms, but while machine learning is the technique that’s most successfully made its way out of research labs into the real world, AI is a broad field covering areas such as computer vision, robotics and natural language processing, as well as approaches such as constraint satisfaction that don’t involve machine learning. Think of it as anything that makes machines seem smart.


NSA: The Silence of the Zero Days

Many organizations would do well to focus more on locking down their systems, and worry less about whether they might get targeted by a zero-day attack. "At the end of the day, if you're bleeding from the eyeballs, just stop the bleeding," BluVector's Lovejoy told me. But as the Equifax breach dramatically demonstrated, it's tough to keep track of all patches. According to software vendor Flexera's Secunia research team, the number of documented, unique vulnerabilities in software increased from 17,147 in 2016 to 19,954 in 2017 - an increase of roughly 16 percent - across about 2,000 products from 200 vendors. The good news, Flexera's Alejandro Lavie told me at RSA, is that "86 percent of [newly announced] vulnerabilities have a patch available within 24 hours of their disclosure." But as the NSA's Hogue warned, patches can be quickly reverse-engineered by hackers - criminals, nation-states or otherwise. So organizations need to do a better job of hardening their hardware and software, including not only tracking but also applying patches everywhere they're required, as quickly as possible.


GDPR could be Facebook's toughest data management test yet


One of the most heated exchanges came between Conservative MP Julian Knight and Schroepfer, the article said, with Knight saying Facebook was a "morality-free zone," destructive to privacy, and not an innocent party that was wronged by Cambridge Analytica. "Your company is the problem," he said. Facebook's vice president and chief privacy officer Erin Egan and vice president and deputy general counsel Ashlie Beringer recently posted an update about the company's GDPR compliance plans and new privacy protections. They introduced new "privacy experiences for everyone on Facebook" as part of GDPR compliance, including updates to its terms and data policy. All users will be asked to review information about how Facebook uses data and to make choices about their privacy on the social network. The company said it would begin by rolling these choices out in Europe. "As soon as GDPR was finalized, we realized it was an opportunity to invest even more heavily in privacy," the posting said. "We not only want to comply with the law, but also go beyond our obligations to build new and improved privacy experiences for everyone on Facebook."


A Multi-Gateway Payment Processing Library for Java

J2Pay is an open source multi-gateway payment processing library for Java that provides a simple, generic API across many gateways. It spares developers from writing separate integration code for each gateway: you write the code once and it works for all of them, and you don't need to read each gateway's documentation individually. ... While working with J2Pay, you will always be passing and retrieving JSON. No matter which format is native to a gateway's API, you will always be using JSON, and for that the library uses org.json. You also don't have to worry about gateway-specific field names - some gateways return the transaction ID as transId, others as transnum - because J2Pay will always return transactionId, and it will give you the same formatted response no matter which gateway you are using. My first and favorite point is that you should not need to read the gateway's documentation, because a developer has already done that for you (maybe you are that developer who integrated the gateway).
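J2Pay's actual internals aren't shown in the article, but the key-normalization idea it describes - mapping each gateway's field names (transId, transnum, ...) onto one canonical name like transactionId - can be sketched with plain java.util maps. The GatewayNormalizer class and its alias table below are hypothetical illustrations, not part of the J2Pay API:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: map gateway-specific response keys onto the
// canonical names a library like J2Pay would hand back to the caller.
public class GatewayNormalizer {
    // Alias table (illustrative): every known spelling of a field
    // maps to the one canonical key the caller should rely on.
    private static final Map<String, String> KEY_ALIASES = new HashMap<>();
    static {
        KEY_ALIASES.put("transId", "transactionId");
        KEY_ALIASES.put("transnum", "transactionId");
        KEY_ALIASES.put("txn_id", "transactionId");
    }

    // Rewrite a raw gateway response so all aliased keys use canonical names;
    // unknown keys pass through unchanged.
    public static Map<String, Object> normalize(Map<String, Object> raw) {
        Map<String, Object> out = new HashMap<>();
        for (Map.Entry<String, Object> e : raw.entrySet()) {
            out.put(KEY_ALIASES.getOrDefault(e.getKey(), e.getKey()), e.getValue());
        }
        return out;
    }

    public static void main(String[] args) {
        Map<String, Object> raw = new HashMap<>();
        raw.put("transId", "abc-123");   // gateway-specific spelling
        raw.put("amount", 100);
        System.out.println(normalize(raw).get("transactionId")); // abc-123
    }
}
```

With a table like this per gateway, the calling code only ever sees transactionId, which is the uniformity the library promises.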


How to master GDPR compliance with enterprise architecture

With the complexity of modern IT services and the increasing amount of data obtained by companies today, it's not uncommon to lose visibility into everywhere information exists - and for data to float to unexpected areas - especially within large organizations. The first step towards achieving full compliance is establishing a clear view of your data: where it lives, how your company processes it and how to quickly access it to make key changes. While this is a daunting and time-consuming task, EA and application portfolio management (APM) tools can help you gain full visibility into your organization's data landscape. Regardless of your existing EA sophistication, taking an application-centered approach will create a strong foundation for success. First, identify all existing applications inside the organization. Use surveys of application owners to uncover which applications involve personal data as defined by the GDPR, ensure that consent has been obtained from all data subjects and identify all business capabilities that use the impacted applications.
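The survey-driven inventory described above can be sketched as a small data model: one record per application, with flags filled in from the owner surveys, then two queries - which business capabilities are impacted, and which applications still lack recorded consent. The GdprInventory class and its field names are hypothetical, not from any particular APM tool:

```java
import java.util.List;
import java.util.Set;
import java.util.TreeSet;

// Hypothetical sketch: an application inventory built from owner surveys,
// queried for the GDPR-relevant subset.
public class GdprInventory {
    // Minimal survey result for one application (field names are illustrative).
    public record Application(String name, boolean processesPersonalData,
                              boolean consentObtained, Set<String> capabilities) {}

    // All business capabilities served by applications that hold personal data.
    public static Set<String> impactedCapabilities(List<Application> apps) {
        Set<String> out = new TreeSet<>();
        for (Application a : apps) {
            if (a.processesPersonalData()) {
                out.addAll(a.capabilities());
            }
        }
        return out;
    }

    // Applications holding personal data without recorded consent: the gap list.
    public static List<Application> missingConsent(List<Application> apps) {
        return apps.stream()
                   .filter(a -> a.processesPersonalData() && !a.consentObtained())
                   .toList();
    }

    public static void main(String[] args) {
        List<Application> apps = List.of(
            new Application("CRM", true, true, Set.of("Sales", "Marketing")),
            new Application("Wiki", false, false, Set.of("Knowledge")),
            new Application("Newsletter", true, false, Set.of("Marketing")));
        System.out.println(impactedCapabilities(apps)); // [Marketing, Sales]
        System.out.println(missingConsent(apps).size()); // 1
    }
}
```

The point of the sketch is the workflow, not the types: once the survey answers are recorded per application, the compliance questions reduce to simple filters over the inventory.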



Quote for the day:


"A leader is always first in line during times of criticism and last in line during times of recognition." -- Orrin Woodward