Daily Tech Digest - July 15, 2018

“Enterprise Architecture As A Service” – What?


Recent success results in organizations having to make big decisions about how to invest in and sustain that success. Perceived failure creates a need for decisions that address the failures. Each of these scenarios gets attention during the strategic planning process and, as pointed out in “Enterprise Architecture as Strategy” by Jeanne W. Ross, Peter Weill, and David Robertson (Harvard Business School Press, 2006), EA is a useful tool. The bottom line is that big decisions are looming, and there is a perception that EA can help by defining “the organizing logic for business processes and IT Infrastructure, reflecting the integration and standardization requirements of the company’s operating model” so that “individual projects can build capabilities – not just fulfill immediate needs”. But there is another, less positive, perception out there: EA can be a money sink! It can produce mountains of paper, take years, and deliver something that is outdated by the time it is finished, to name just a few concerns. Moreover, the need for change often arrives on a timeline shorter than the perceived timeline for producing an Enterprise Architecture.


HTC’s blockchain phone is real, and it’s arriving later this year

Prior to the launch, the company is partnering with the popular blockchain title, CryptoKitties. The game will be available on a small selection of the company’s handsets, starting with the U12+. “This is a significant first step in creating a platform and distribution channel for creatives who make unique digital goods,” the company writes in a release tied to the news. “Mobile is the most prevalent device in the history of humankind and for digital assets and dapps to reach their potential, mobile will need to be the main point of distribution. The partnership with Cryptokitties is the beginning of a non fungible, collectible marketplace and crypto gaming app store.” In other words, the company is attempting to reintroduce the concept of scarcity through these decentralized apps. HTC will also be partnering with Bitmark to help accomplish this. If HTC is looking for the next mainstream play to right the ship, this is emphatically not it.


Interview: Bill Waid talks about AI/ML


What is interesting about this well-known and often referenced use of AI/ML is the potential opportunity cost. Despite the significant savings realized, declining a customer transaction that was not fraudulent leads to something even more costly: an unsatisfactory customer engagement and eventual attrition. To operationalize this AI/ML solution and fully realize its value, decisioning and a continuous improvement feedback loop were required. Capitalizing on the power of AI/ML, FICO has expanded both the algorithms and the application of AI/ML to a broad set of solutions since 1992. Most notable is the use of ML to find predictive patterns in the ever-expanding data lakes our clients are collecting, and using those ML findings to augment existing decisions and incrementally improve business outcomes. By deploying ML models in a way that lets the decision outcome be managed and monitored to ensure accuracy, business owners could learn from the ML model and gain confidence that it was indeed providing tangible improvement. This last innovation was a natural evolution toward what FICO refers to as explainable AI (xAI).


How AI will change your healthcare experience (but won’t replace your doctor)

Techniques such as machine learning enable healthcare providers to analyze large amounts of data, allowing them to do more in less time and supporting them with diagnosis and treatment decisions. For example, suppose you feed a computer program a large number of medical images that either show or do not show symptoms of a disease. The program can then learn to recognize images that may point towards the disease. Researchers at Stanford, for instance, developed an algorithm that helps to evaluate chest X-rays and identify images with pneumonia. This doesn’t mean, however, that the radiologist will no longer be needed. Instead, think of AI as a smart assistant that will support doctors, alleviating their workload. This is also how we approach AI at Philips: we work together with clinicians to develop solutions that make their lives easier and improve the patient experience. That’s why we believe in the power of adaptive intelligence. It’s not really about AI per se – it’s about helping people with technology that adapts to their needs and extends their capabilities.


Machine learning will redesign, not replace, work

"Any manager could take this rubric, and if they're thinking of applying machine learning this rubric should give them some guidance," he said. "There are many, many tasks that are suitable for machine learning, and most companies have really just scratched the surface." ... Since a job is just a bundle of various tasks, it's also possible to use the rubric to measure the suitability of entire occupations for machine learning. Using data from the federal Bureau of Labor Statistics, that's exactly what they did—for each of the more than 900 distinct occupations in the U.S. economy, from economists and CEOs to truck drivers and schoolteachers. "Automation technologies have historically been the key driver of increased industrial productivity. They have also disrupted employment and the wage structure systematically," the researchers write. "However, our analysis suggests that machine learning will affect very different parts of the workforce than earlier waves of automation … Machine learning technology can transform many jobs in the economy, but full automation will be less significant than the reengineering of processes and the reorganization of tasks."


Reinventing The Enterprise - Digitally

Through autonomization and emergence, self-tuning firms create significant advantages. They can better understand customers by leveraging data from their own ecosystems and platforms to develop granular insights and automatically customize their offerings. They can develop more new, marketable products by experimenting with offerings and leveraging proprietary data. And they can implement change more quickly and at lower cost by acting autonomously. The benefits of autonomization and emergence far exceed those that can be realized from digitization programs aiming to increase efficiency or product innovation alone. They are compounded by self-reinforcing network and experience effects: better offerings attract more customers and more data; experimentation brings knowledge that increases the value of future experimentation. One example of a self-tuning organization is Alibaba. Not only does its e-commerce platform provide a sea of user data, but the company uses it to generate granular, real-time insights.


Two studies show the data center is thriving instead of dying

The top reasons for such investment are security and application performance (75% of respondents) and scalability (71%). It also found that 53% of respondents intend to increase investment in software-defined storage, 52% in NAS and 42% in SSD ... IHS noted that while new technologies such as artificial intelligence and containers are gaining traction, traditional data center apps, such as Microsoft Office (22%), collaboration tools such as email, SharePoint, and unified communications (18%), and general-purpose IT apps (30%), are still being used. The second survey comes from SNS Telecom & IT, a market research firm based in Dubai, UAE. It attributes the financial services industry’s investment in IT equipment to the growth of big data and the subsequent massive inflow of all sorts of unstructured data. “As this Big Data construct expands to include streaming and archived data along with sensor information and transactions, the financial sector continues its steady embrace of big data analytics for high-frequency trading, fraud detection and a growing list of consumer-oriented applications,” said the authors.


Despite the security measures you've taken, hacking into your network is trivial

Closing security vulnerabilities and establishing effective cybersecurity policies and procedures is going to require more than just better technology. Effective security will demand a complete change of attitude by every employee, executive, and individual operating a computing device. Security must become the priority, even at the expense of convenience. Confirming results reported in other studies, the Positive Technologies research showed that more than a quarter of employees still inexplicably clicked a malicious link sent to them in an email. Despite extensive training and retraining, employees--regardless of industry or level of technical knowledge--continue to operate with an almost unconscious lack of security awareness. Until this cavalier attitude toward protecting company data changes, phishing attacks and authentication circumvention will continue to plague the modern enterprise.


The Economics Of AI - How Cheaper Predictions Will Change The World


Key to this, they argue, will be whether human AI “managers” can learn to differentiate between tasks involving prediction, and those where a more human touch is still essential. When I met with Joshua Gans – professor of strategic management and holder of the Jeffrey S Skoll Chair of Technical Innovation and Entrepreneurship at the University of Toronto – he gave me some insight into how economists are tackling the issues raised by AI. "As economists studying innovation and technological change, a conventional frame for trying to understand and forecast the impact of new technology would be to think about what the technology really reduces the cost of," he tells me. "And really it’s an advance in statistical methods – a very big advance – and really not about intelligence at all, in a way a lot of people would understand the term ‘intelligence.' ... “When I look up at the sky and see there are grey clouds, I take that information and predict that it’s going to rain. When I’m going to catch a ball, I predict the physics of where it’s going to end up. I have to do a lot of other things to catch the ball, but one of the things I do is make that prediction.”


Creating a Defensible Security Architecture

Controls should not only face the Internet but also be implemented to secure access from internal assets to other internal assets. Basic adjustments such as this allow for far superior prevention controls and, more importantly, detection controls. Think about this for a moment: if a computer in subnet or zone A attempts to talk to any system in zone B and that traffic is not allowed, then the connection will be denied, and you will be notified. Basic firewall rules aren't rocket science, but they are highly effective controls. Modern challenges must also be overcome. For instance, consider an intrusion detection/prevention device, web proxy, data loss prevention sensor, network antivirus, or any other Layer 7 network inspection solution. These are all crippled by network encryption. Your shiny new NGFW may not be configured to handle 70%+ of the traffic going through it. Basically, without understanding technologies like Secure Sockets Layer (SSL) inspection, SSL decrypt mirroring, HTTP Strict Transport Security (HSTS), certificate transparency, and HTTP Public Key Pinning (HPKP), how can you handle modern encryption?
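The zone-to-zone rule described above can be sketched in a few lines: a default-deny policy where internal-to-internal traffic is blocked and logged unless explicitly allowed. The zone names and rules below are hypothetical illustrations, not a real firewall configuration.

```java
import java.util.Map;
import java.util.Set;

// Minimal sketch of a default-deny, zone-to-zone policy: traffic between
// zones is denied (and logged for notification) unless an allow rule exists.
public class ZonePolicy {
    // Allow rules: source zone -> set of destination zones it may reach.
    private final Map<String, Set<String>> allowRules;

    public ZonePolicy(Map<String, Set<String>> allowRules) {
        this.allowRules = allowRules;
    }

    /** Returns true if allowed; otherwise denies and emits an alert line. */
    public boolean check(String srcZone, String dstZone) {
        boolean allowed = allowRules.getOrDefault(srcZone, Set.of()).contains(dstZone);
        if (!allowed) {
            System.out.println("ALERT: denied " + srcZone + " -> " + dstZone);
        }
        return allowed;
    }

    public static void main(String[] args) {
        ZonePolicy policy = new ZonePolicy(Map.of(
                "workstations", Set.of("web-proxy"),  // workstations may reach only the proxy
                "web-proxy", Set.of("internet")));    // the proxy may reach the Internet
        System.out.println(policy.check("workstations", "web-proxy")); // allowed
        System.out.println(policy.check("workstations", "database"));  // denied + alert
    }
}
```

The point is exactly the article's: the rule logic is trivial, but a default-deny stance between internal zones yields both prevention (the deny) and detection (the alert) for free.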



Quote for the day:


"Technology makes it possible for people to gain control over everything, except over technology." -- John Tudor


Daily Tech Digest - July 14, 2018


To date, the tools which underpin workforces have been developed as a natural extension of traditional work flows. Email replaced the memo, and video chat made the conference call more collaborative. But emerging technologies like advanced analytics, artificial intelligence, and machine learning are primed to provide a comprehensive look into the patterns and intricacies that make up the individual workplace experience. For example, with the right platform, IT departments can better understand which channels employees prefer, what is drawing them to these channels, and how they can better optimize it for even further productivity. Alternatively, they can identify problem areas within work flows and proactively ease the strain on employees themselves. As technology becomes more advanced, the human element becomes increasingly vital. Digital transformation saw a seismic shift in the way IT leaders approach their infrastructure, but workplace transformation requires a deep understanding of the unique ways individuals approach productivity.


Entity Services Increase Complexity

Entity services are modelled after defined entities (or nouns) within a system. For example, an accounts service, order service and customer service. Typically they have CRUD-like interfaces which operate on top of these entities. By taking this CRUD-like approach, entity services tend not to contain any meaningful business functionality. Instead, they are shallow modules, not really offering any complex or useful abstractions. ... Ultimately, these shallow entity services can turn into a cluster of highly coupled components, writes Abedrabbo. This leads to an operational burden, where more components must be deployed, scaled and monitored. This high coupling can also lead to challenging release processes, where many microservices must be deployed in order to deliver a single piece of functionality. It can also produce single points of failure, where many services depend on each other, meaning that if one fails it can bring down the entire system. Abedrabbo also explains that entity services create conceptual complexity, as the knowledge of how to compose them is not immediately obvious.
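The "shallow module" shape the article criticizes can be made concrete. Below is a hypothetical sketch of such a CRUD entity service: it stores and returns orders but carries no business behaviour, so the real logic of an operation like "place an order" ends up spread across every caller.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the entity-service anti-pattern: a service modelled on a noun,
// exposing only CRUD. The Order record and service names are illustrative.
public class EntityServiceSketch {
    record Order(String id, String customerId, double total) {}

    // Shallow CRUD module: it persists and retrieves orders, nothing more.
    static class OrderService {
        private final Map<String, Order> store = new HashMap<>();
        Order create(Order o) { store.put(o.id(), o); return o; }
        Order read(String id)  { return store.get(id); }
        void delete(String id) { store.remove(id); }
    }

    public static void main(String[] args) {
        OrderService orders = new OrderService();
        orders.create(new Order("o-1", "c-42", 99.0));
        // A real business operation ("place an order") would also have to call
        // the customer service, the accounts service, and so on, in the right
        // sequence -- that composition logic, duplicated in every caller, is
        // the coupling and conceptual complexity the article warns about.
        System.out.println(orders.read("o-1"));
    }
}
```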


An exciting time to be in cyber security innovation


There is a wide range of initiatives specifically around cyber security in the UK, says Chappell, including the Cyber Growth Partnership, which supports fast-growing security companies. “There are some great opportunities in this sector, which is partly due to our UK heritage going back to Bletchley Park,” he says. The UK also benefits from having top students from all over the world who come to further their education, a thriving financial sector and a strong defence sector. “We are lucky to have this heady mix of components that create an environment where it is great to be building a business,” says Chappell. Also, thanks to companies such as Message Labs and Sophos, the UK has useful templates or archetypes for fast-growing successful businesses that startups can draw upon, he adds. The growing number of incubators is also creating opportunities for cyber security innovators, with Lorca being the latest to join its sister centre in Cheltenham, the NCSC Cyber Accelerator, CyLon and its HutZero bootcamp for entrepreneurs.


Reddit Co-Founder Alexis Ohanian's Top Self-Care Strategies for Entrepreneurs

“Entrepreneurs have to have enough ego to think that our crazy idea, our vision for the future is going to work, before anyone else does. But [it’s important to] balance that with enough humility to know that you aren’t going to have all the answers,” Ohanian says. “You are going to need to rely on different points of view. Get the benefit of someone who is detached enough to give you honest feedback, but attached enough to know all the players and background information.” Ohanian’s feeling is, if you wouldn’t expect a talented athlete or sports team to play without their coach, why shouldn’t it be the same for a great entrepreneur? ... “One of the things founders and CEOs in particular should always be doing and keeping top of mind is celebrating those wins for their business,” Ohanian says. “It will never feel like a 100 percent win for the CEO or founder, because you’re always thinking about the 100 other things that need to get improved or fixed. But for all the people on your team, it is really vital to celebrate them and that success. Not in a way that gets people complacent, but rejuvenated and re-excited about the mission and vision.”


Why You Should Consider A Career In Cybersecurity


Cybersecurity professionals are generally among the most highly compensated technology workers. According to the United States Department of Labor, the median annual wage for information security analysts is almost $100,000 nationally, with many jobs in various locations paying considerably more. With the demand for cybersecurity professionals continuing to far outpace the supply, salaries are likely to continue rising. As such, investing in cybersecurity training now can pay off quite handsomely ... For multiple reasons, many companies are far less likely to let go of cybersecurity professionals than they would other employees. Shrinking the security team may increase the likelihood of a breach, and can dramatically increase the impact of a breach should one occur; think for a moment about customers’ and regulators’ reactions to news reports that “A large amount of personal data leaked after company X tried to save money by reducing its cybersecurity staff.” Of course, as alluded to before, another deterrent against letting information security professionals go is that employers know that it is often both difficult and expensive to find suitable replacements.


Let There Be Sight: How Deep Learning Is Helping the Blind ‘See’

Guide dogs are great for helping people who are blind or visually impaired navigate the world. But try getting a dog to read aloud a sign or tell you how much money is in your wallet. Seeing AI, an app developed by Microsoft AI & Research, has the answers. It essentially narrates the world for blind and low-vision users, allowing them to use their smartphones to identify everything from an object or a color to a dollar bill or a document. Since the app’s launch last year, it’s been downloaded 150,000 times and used in 5 million tasks, some of which were completed on behalf of one of the world’s most famous blind people. “Stevie Wonder uses it every day, which is pretty cool,” said Anirudh Koul, a senior data scientist with Microsoft, during a presentation at the GPU Technology Conference in San Jose last month. A live demo of the app showed just how powerful it can be. Koul had a colleague join him on stage, and when he launched the app on his smartphone and pointed it toward his co-worker, it declared that it was looking at “a 31-year-old man with black hair, wearing glasses, looking happy.”


Graphing the sensitive boundary between PII and publicly inferable insights

There is a fuzzy boundary between information that’s personally identifiable and insights about persons that are publicly inferable. GDPR and similar mandates only cover protection of discrete pieces of digital PII that are maintained in digital databases and other recordkeeping systems. But some observers seem to be arguing that it also encompasses insights that might be gained in the future about somebody through analytics on unprotected data. That’s how I’m construing David Loshin’s statement that “sexual orientation [is] covered under GDPR, too.” My pushback to Loshin’s position is to point out that it’s not terribly common for businesses or nonprofits to record people’s sexual orientation, unless an organization specifically serves one or more segments of the LGBTQ community — and even then, it’s pointless and perhaps gauche and intrusive to ask people to declare their orientation formally as a condition of membership. So it’s unlikely you’ll find businesses maintaining PII profile records stating that someone is gay, lesbian, bisexual or whatever.


Ultimate Guide To Blockchain In Insurance

Within insurance, the claims and finance functions are high-value areas where blockchain could be beneficial, especially when you look at processes that need ongoing reconciliation with external parties. Consider how often Company A has a claim against Company B resulting in the exchange of money, typically in the form of a paper check or an electronic transaction. That could be completely automated using blockchain. Presently, many insurers are applying a smart contract alongside the blockchain, which is triggered when well-defined terms and conditions are met. By setting up an insurance contract that pays out under these circumstances, an insurer can process transactions with no human intervention and greatly enhanced customer service. In other words, blockchain can help deliver on the digital opportunities that insurers must get right. These opportunities aren’t glamorous but they’re important: as I’ve said before, get them right and you won’t win—but get them wrong and you will lose. Blockchain can help insurers deliver on some brilliant basics.
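The "smart contract triggered when well-defined terms and conditions are met" idea can be illustrated with a tiny payout rule. The flight-delay trigger and amounts below are hypothetical, and this is a plain-Java sketch of the decision logic, not a real on-chain contract.

```java
// Sketch of a parametric-insurance rule: when a machine-checkable condition
// is met, the payout fires with no human review. Figures are illustrative.
public class ParametricClaim {
    static final int DELAY_THRESHOLD_MINUTES = 120; // well-defined trigger
    static final double PAYOUT = 200.0;             // agreed payout amount

    /** Returns the payout owed for a reported delay, with no human intervention. */
    static double settle(int delayMinutes) {
        return delayMinutes >= DELAY_THRESHOLD_MINUTES ? PAYOUT : 0.0;
    }

    public static void main(String[] args) {
        System.out.println(settle(45));   // below threshold: no payout
        System.out.println(settle(180));  // condition met: automatic payout
    }
}
```

In a real deployment the trigger data (e.g. a flight-status feed) and the transfer of funds would live on the blockchain; the value is that both insurer and customer can verify the rule and its execution without reconciliation.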


Preparing Your Business For The Artificial Intelligence Revolution


Artificial intelligence can be used to solve problems across the board. AI can help businesses increase sales, detect fraud, improve customer experience, automate work processes and provide predictive analysis. Industries like health care, automotive, financial services and logistics have a lot to gain from AI implementations. Artificial intelligence can give health care providers better tools for early diagnostics. Autonomous cars are a direct result of improvements in AI. Financial services can benefit from AI-based process automation and fraud detection. Logistics companies can use AI for better inventory and delivery management. The retail business can map consumer behavior using AI. Utilities can use smart meters and smart grids to decrease power consumption. The rise of chatbots and virtual assistants is also a result of artificial intelligence. Amazon's Alexa, Google's Home, Apple's Siri and Microsoft's Cortana all use AI-based algorithms to make life better. These technologies will take more prominent roles in dictating future consumer behavior.


Prime Minister Of Luxembourg Xavier Bettel On Technology, Culture And People

“Current” is always a bit of a difficult word when it comes to technology, because innovative ideas or products often grow and mature in waves. Consequently, over time, new technologies experience highs during which they are heavily publicized and on everybody’s mind. They also go through lows, during which they appear to be completely forgotten. Yet, the research continues! Having said that, I am actually very fond of the world of virtual and augmented reality. Yes, the technology, or at the very least the idea and concepts of VR and AR, have been around for quite some time now. But it is truly exciting to discover all the new opportunities these technologies offer us thanks to the recent advances in computing power, be it in the medical domain, in education, in transport…they make our world better and safer! ... In order to reap the full potential of our digital economy, European rules must ultimately enable and encourage our businesses and citizens to buy and sell their services and products anywhere in the European Union.



Quote for the day:


"The problem isn't a shortage of opportunities; it's a lack of perspective." -- Tim Fargo


Daily Tech Digest - July 13, 2018

Bill Hoffman of the Industrial Internet Consortium talks AI, IoT and more
Price point, availability, wired or wireless, then the fact that you can hook them up to the Ethernet – IPv6 provides almost no degradation of performance – so you can put a lot of stuff on it, which we also couldn’t do 30 years ago. We were all running Novell local area nets at the time! Who remembers Novell? So I think the technology has become much more robust and available, such that we’re able to use the big data and apply predictive analytics, and use these things in industrial systems that we couldn’t have even dreamed of 20 years ago. And when we say “industrial” [Internet of Things], the “Industrial Internet” is really an “industry” Internet, not just manufacturing per se. “Industrial Internet” was actually a term of art GE had coined, back in, I believe, 2013, and they didn’t trademark it intentionally, because they wanted it to remain a term of art. So when Richard [Soley] and I sat around the table with the five founders [of the Industrial Internet Consortium], we had hours of discussion about what to call this new entity we were going to create.



Cryptocurrency Exchange Developer Bancor Loses $23.5 Million

Some of Bancor's losses, however, are recoverable. Bancor says it has recouped $10 million worth of BNT, a type of token that facilitates trades within its exchange. How Bancor executed that recovery, however, leads into a heated debate among cryptocurrency enthusiasts. BNT differs from bitcoin in that it is a centrally generated token. New bitcoins are created through a process called mining, in which computers that verify transactions on the network are rewarded with a slice of bitcoin. BNT, like many other cryptocurrencies such as Ripple's XRP, Cardano's Ada, Block.one's EOS and Stellar's Lumens, isn't mined. These types of coins have powered Initial Coin Offerings, where an organization creates a centrally issued coin and sells it to raise funding. ICOs, which some contend could expose investors to fraud, are being closely analyzed by regulators around the world. The question is whether the coins or tokens that are issued are more like securities akin to stocks than an asset. The sale of securities often entails a different, stricter set of trading rules.


Peer Reviews Either Sandbag or Propel Agile Development


First, peers likely provide valuable feedback and have fresh eyes to catch mistakes that you might miss after spending hours working. Second, working on a fast-moving Agile team, you need to continually build consensus so that there is not a communication backlog. Lastly, for teams working in highly-regulated industries, peer reviews may be a required piece of a larger software assurance program. As more software development teams trend toward an Agile approach, software releases are becoming more frequent. If you are not able to speed up your peer review cycles in tandem, you may start to sacrifice quality to hit deadlines. That then translates to a buildup of technical debt. How can you avoid this scenario? It takes structure, but flexible structure. ... Most teams don’t have an explicit plan around their internal communications. The tools that they employ typically dictate the communication norms. If your team adopts Slack or another messaging app, then it quickly becomes common for folks to have short, timely chats. The expectation is that the other person replies within a relatively short timeframe.


Doing Performance Testing Easily using JUnit and Maven

Sometimes, we tend to think that performance testing is not part of the development process. This is probably because no stories get created for it during the usual development sprints, which means this important aspect of a product's or service's APIs is not taken care of. But that's not the point; the point is: why do we think it should not be part of the usual development cycle? Or why do we keep it until the end of the project cycle? Adding more ground to the above thinking, there are no straightforward approaches to doing performance testing the way we do unit testing, feature/component testing, e2e integration testing or consumer-contract testing. So the developers or the performance testers (sometimes a specialized team) are asked to choose a standalone tool from the marketplace, produce some fancy reports on performance testing, and share those reports with the business or technology team. That means it is done in isolation, sometimes after or towards the end of the development sprints, approaching the production release date.
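The article's premise is that a performance check can live alongside ordinary unit tests. JUnit 5 expresses this with `Assertions.assertTimeout`; below is a dependency-free sketch of the same idea so the example runs without any test framework. The budget and workload are hypothetical.

```java
import java.time.Duration;
import java.util.function.Supplier;

// Dependency-free sketch of the check a timing-based performance test makes:
// run the code under test, fail if it exceeds a stated time budget.
public class PerfCheck {
    /** Runs the task, returning its result, or throws if it blows the budget. */
    static <T> T assertWithin(Duration budget, Supplier<T> task) {
        long start = System.nanoTime();
        T result = task.get();
        long elapsedMs = (System.nanoTime() - start) / 1_000_000;
        if (elapsedMs > budget.toMillis()) {
            throw new AssertionError(
                "took " + elapsedMs + " ms, budget " + budget.toMillis() + " ms");
        }
        return result;
    }

    public static void main(String[] args) {
        // Hypothetical "API call": sum a range, with a generous 1-second budget.
        long sum = assertWithin(Duration.ofSeconds(1), () -> {
            long s = 0;
            for (int i = 0; i < 1_000_000; i++) s += i;
            return s;
        });
        System.out.println(sum);
    }
}
```

Because a check like this is just another test method, it can run in every Maven build (e.g. via Surefire) instead of waiting for an end-of-project tooling exercise, which is exactly the shift the article argues for.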


Government Bodies Are At Risk Online

Commitment to Online Trust and Security
Busy government staff don’t always have the time to learn cybersecurity best practice. Government employees working in departments such as planning, finance, human resources and the administration staff that support them, have intense workloads – so it’s important they can work quickly and efficiently, without compromising their safety online. It’s thought that as many as 95% of successful online hacks come down to human error. Mistakes are made by those who aren’t educated in online risks and can’t spot threats to their data. Sometimes it’s not a lack of knowledge, but a problem with relying solely on human performance. Even the most educated person can make mistakes that cause huge data breaches. Government organisations need to limit the risk of human error as much as possible. If it’s a case of staff reusing static or simple passwords that can be stolen using brute force attacks, then 2FA can be a solution. Once it has been used, successfully or unsuccessfully, then it becomes invalid.
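To make the 2FA suggestion concrete: most authenticator apps generate time-based one-time passwords (TOTP, RFC 6238), so a stolen static password is useless without the current code. A minimal sketch follows, using the RFC's published test key purely for illustration (never a real shared secret).

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.ByteBuffer;

// Sketch of TOTP derivation (RFC 6238), the mechanism behind many 2FA apps:
// HMAC the current 30-second time step with a shared secret, then truncate.
public class Totp {
    static int code(byte[] secret, long unixSeconds, int digits) {
        try {
            long counter = unixSeconds / 30;                  // 30-second time step
            byte[] msg = ByteBuffer.allocate(8).putLong(counter).array();
            Mac mac = Mac.getInstance("HmacSHA1");
            mac.init(new SecretKeySpec(secret, "HmacSHA1"));
            byte[] h = mac.doFinal(msg);
            int offset = h[h.length - 1] & 0x0f;              // dynamic truncation
            int bin = ((h[offset] & 0x7f) << 24) | ((h[offset + 1] & 0xff) << 16)
                    | ((h[offset + 2] & 0xff) << 8) | (h[offset + 3] & 0xff);
            return bin % (int) Math.pow(10, digits);
        } catch (Exception e) {
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) {
        byte[] secret = "12345678901234567890".getBytes();    // RFC 6238 test key
        System.out.println(code(secret, 59L, 8));             // prints 94287082
    }
}
```

Because each code is valid for only one time step, a phished or reused password alone no longer grants access, which directly addresses the human-error risk the article describes.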


Building the future of retail with the Internet of Transport

Wincanton wants to use sensors to automatically alert its employees to any potential deterioration in products during transportation. As part of this project, Gifford says the firm's technological efforts have produced developments in three key areas so far. He points first to Winsight, an app that enables a paperless cab, so all the paper lorry drivers normally carry, such as routes and proof of delivery, is wrapped up into a single piece of software on a smart device. The app is available to the firm's own drivers and sub-contractors. The second key element is telematics. "That's about us plugging into the vehicle's systems and sending information back to the business in a consistent way," says Gifford. Wincanton recently announced it will install MiX telematics in 1,800 of its vehicles as part of an ongoing safety programme, with information used to optimise driver performance. The final element is the implementation of a new, cloud-based transport management system (TMS). This TMS will form the basis for the firm's digital supply-chain strategy, with telematics helping to hone operational performance and Winsight helping to ensure business efficiency and effectiveness.


Here come the first blockchain smartphones: What you need to know

Sirin blockchain phones
It appears the world's third-biggest handset maker may win a race to become the industry's first to offer a blockchain smartphone; Swiss-based Sirin Labs announced its own $1,000 smartphone and $800 all-in-one PC with native blockchain capabilities last October; it scheduled the release for this September, according to reports. HTC, however, plans to release its phone this quarter. HTC's blockchain phone has already received "tens of thousands" of reservations globally, Phil Chen, the chief crypto officer at HTC, said in an interview during the RISE conference in Hong Kong this week. Like HTC's upcoming $1,000 Exodus blockchain smartphone, Sirin's Finney smartphone will come with a built-in cold-storage crypto wallet for storing bitcoin, Ethereum and other digital tokens, and it will run on an open-source, feeless blockchain. Sirin was able to raise more than $100 million in an initial coin offering for the Android-based Finney smartphone and PC. Both will run Sirin's open-source operating system, SIRIN OS.


Apache Mesos and Kafka Streams for Highly Scalable Microservices

Apache Mesos is a cluster manager that provides efficient resource isolation and sharing across distributed applications or frameworks. It sits between the application layer and the operating system, making it easy and efficient to deploy and manage applications in large-scale clustered environments. Apache Mesos abstracts away data center resources to make it easy to deploy and manage distributed applications and systems. DC/OS is a Mesosphere-backed framework on top of Apache Mesos. As a datacenter operating system, DC/OS is itself a distributed system, a cluster manager, a container platform, and an operating system. DC/OS has evolved a lot in the past couple of years, and supports new technologies like Docker as its container runtime and Kubernetes as its orchestration framework. As you can imagine from this high-level description, DC/OS is a first-class infrastructure choice for realizing a scalable microservice architecture.


Your Roadmap to an Open Mobile Application Development Strategy


An MADP allows a business to rapidly build, test and deploy mobile apps for smartphones and tablets. It can minimize the need for coding, integrate building-block services, such as user management, data management and push notifications, and deliver apps across a broad array of mobile devices. The result is a common and consistent approach, so developers can customize their apps without worrying about back-end systems or implementation details. Michael Facemire, principal analyst at Forrester Research, observed in an analysis of Mobile Development Platforms that companies fall into two camps: “those that prefer an all-inclusive platform”, which represents the greatest, though waning, part of platform spend today, and “those that prefer to manage a collection of services”. The first group of customers works with large infrastructure vendors, such as IBM, Oracle, and SAP, which offer complete environments for development, delivery, and management of mobile applications. They benefit from platform stability and custom support, but may struggle compared with other platforms when building mobile experiences outside of their proprietary ecosystems.


Hacker-powered security is reaching critical mass

“Crowdsourced security testing is rapidly approaching critical mass, and ongoing adoption and uptake by buyers is expected to be rapid,” Gartner reported. Governments are leading the way with adoption globally. In the government sector there was a 125 percent increase year over year, with new program launches including the European Commission and the Ministry of Defense Singapore, joining the U.S. Department of Defense on HackerOne. Proposed legislation like the Hack the Department of Homeland Security Act, the Hack Your State Department Act, the Prevent Election Voting Act, and the Department of Justice Vulnerability Disclosure Framework further demonstrates public sector support for hacker-powered security. Industries beyond technology continued to increase their share of the overall hacker-powered security market. Consumer Goods, Financial Services & Insurance, Government, and Telecommunications account for 43 percent of today’s bug bounty programs. Automotive programs increased 50 percent in the past year and Telecommunications programs increased 71 percent.



Quote for the day:


"Experience without theory is blind, but theory without experience is mere intellectual play." -- Immanuel Kant


Daily Tech Digest - July 12, 2018

WAFs Should Do A Lot More Against Current Threats Than Covering OWASP Top 10
Organizations invest to increase network capacity, ultimately accommodating fictitious demand. Accurate distinction between human traffic and bot-based traffic, and between “good” bots (like search engines and price comparison services) and “bad” bots, can translate into substantial savings and an uptick in customer experience. The bots won’t make it easy on you: they can now mimic human behavior and bypass CAPTCHA and other challenges. Moreover, dynamic IP attacks render IP-based protection ineffective. Oftentimes, open source dev tools (PhantomJS, for instance) that can process client-side JavaScript are abused to launch brute-force, credential stuffing, DDoS and other automated bot attacks. To manage bot-generated traffic effectively, a unique identification (like a fingerprint) of the source is required. Since bot attacks use multiple transactions, the fingerprint allows organizations to track suspicious activity, attribute violation scores and make an educated block/allow decision with a minimal false-positive rate.
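The fingerprint-plus-violation-score flow can be sketched in a few lines; the fingerprint fields, violation weights, and threshold below are hypothetical stand-ins for what a real WAF would tune per application:

```python
from collections import defaultdict

BLOCK_THRESHOLD = 10          # hypothetical; real WAFs tune this per app

# Hypothetical weights for categories of suspicious behaviour.
VIOLATION_WEIGHTS = {
    "failed_captcha": 4,
    "headless_browser_hint": 3,
    "rapid_requests": 2,
    "credential_stuffing_pattern": 6,
}

scores = defaultdict(int)     # fingerprint -> accumulated violation score

def fingerprint(request):
    """Derive a crude client fingerprint. Real products combine many more
    signals (TLS parameters, JavaScript probes, etc.); this is a sketch."""
    return (request["user_agent"], request["accept_language"], request["tls_ja3"])

def record_violation(request, violation):
    fp = fingerprint(request)
    scores[fp] += VIOLATION_WEIGHTS.get(violation, 1)
    return scores[fp]

def decide(request):
    """Block only once the accumulated score crosses the threshold, which
    keeps the false-positive rate low for one-off anomalies."""
    return "block" if scores[fingerprint(request)] >= BLOCK_THRESHOLD else "allow"
```

Because scoring accumulates across transactions tied to one fingerprint, a single odd request is allowed through, while sustained automated abuse eventually trips the block.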



Why your whole approach to security has to change

New technology philosophies like DevOps and Agile provide the opportunity to build security into the whole lifecycle that exists around IT use. By embedding proper security processes around cloud resources, companies can make their workflows deliver security into the fabric of this new architecture from the start. Getting this degree of oversight and security in place involves making security goals and objectives clear to everyone, while also enabling those processes to run smoothly and effectively. It involves making security management into more than just a blocker for poor software; instead, it is about making services available quickly within those workflows. This process is termed transparent orchestration. Transparent orchestration involves a re-wiring of security to match how this IT infrastructure has been rebuilt. As part of this, security must be automatically provisioned across a complete mix of internal and external networks, spanning everything from legacy data centre IT through to multi-cloud ecosystems and new container-based applications.



Top 3 practical considerations for AI adoption in the enterprise

Explainable AI centers on the ability to answer the question, “Why?” Why did the machine make a specific decision? The reality is that many of the new AI systems that have emerged are inherently “black boxes”: many inputs go into the box, and out of it comes the actual decision or recommendation. However, when people try to unpack the box and figure out its logic, it becomes a major challenge. This can be tough in regulated markets, which require companies to disclose and explain the reasoning behind specific decisions. Further, the lack of explainable AI can affect the change management needed throughout the company to make AI implementations succeed. If people cannot trace an answer to an originating dataset or document, it can become a hard value proposition for staff. Implementing AI with traceability is a way to address this challenge. For example, commercial banks manage risk in an online portfolio. A bank may lend money to 5,000 small- to medium-sized businesses and monitor their health through the balance sheets within its portfolio of loans. These sheets may be in different languages or follow different accounting standards.
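A minimal sketch of what traceability can look like in code, using a toy risk rule over the balance-sheet example above (the leverage rule, its 0.8 threshold, and the field names are all invented for illustration):

```python
def assess_loan(balance_sheet):
    """Toy risk check that records which inputs drove the decision.

    The leverage rule and its threshold are invented; the point is that
    the answer carries a trace back to the originating document, which
    is what explainability in regulated markets requires.
    """
    reasons = []
    leverage = balance_sheet["liabilities"] / balance_sheet["assets"]
    if leverage > 0.8:
        reasons.append({
            "rule": "leverage > 0.8",
            "value": round(leverage, 2),
            "source": balance_sheet["document_id"],   # traceable origin
        })
    return {"decision": "review" if reasons else "approve", "trace": reasons}
```

Staff reviewing a flagged loan can follow the trace straight back to the balance sheet that triggered it, instead of confronting an unexplained verdict.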



Blockchain as a Re-invention of the Business
The current model is very much authority-centric, which leaves a narrow space for individuals and small legal entities, who are mere spectators or limited contributors. If we take into account the democratization of choice that boosts people’s motivation nowadays, we can come to the conclusion that the current approach stands against the right to self-empowerment. Blockchain, a wonderful combination of mathematics and technology, made it possible to distribute power to the nations where it actually belongs. “With great power comes great responsibility,” as the Marvel comics super-heroes used to say. This could be a motto of DLT. Blockchain is a natural service area that hooks up to the Internet as the connectivity layer. Business globalization and economic freedom are two main forces of paramount significance underpinning the evolution of the distributed transactional platform. The central system played an absolutely vital role in times of corruption and global disorders or wars. In the current reality, people deserve to operate within a planetary technological network.


A Look at the Technology Behind Microsoft's AI Surge

Lambda architecture, while a general computing concept, is built into the design of Microsoft's IoT platform. The design pattern here focuses on managing large volumes of data by splitting it into two paths -- the speed path and the batch path. The speed path offers real-time querying and alerting, while the batch path is designed for larger data analysis. While not all AI scenarios use both of these paths, this is a very common edge computing pattern. At the speed layer, Azure offers two main options -- Microsoft's own Azure Stream Analytics offering and the open source Apache Kafka, which can be implemented using the HDInsight Hadoop as a Service (HDaaS) offering, or on customers' own virtual machines (VMs). Both Stream Analytics and Kafka offer their own streaming query engines (Stream Analytics' engine is based on T-SQL). Additionally, Microsoft offers Azure IoT Hubs and Azure Event Hubs, which connect edge devices (such as sensors) to the rest of the architecture. IoT Hubs offer a more robust solution with better security; Event Hubs are specifically designed just for streaming Big Data from system to system.
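The two-path split itself is simple to sketch; here the Azure services are replaced by plain Python stand-ins, and the sensor alert threshold is invented:

```python
import statistics

ALERT_THRESHOLD = 90.0   # hypothetical sensor limit

alerts = []              # speed path output: real-time alerting
batch_store = []         # batch path: durable store for later analysis

def ingest(event):
    """Fan each event out to both paths of a lambda architecture."""
    # Speed path: cheap, immediate check (what Stream Analytics or a
    # Kafka streaming query would do at scale).
    if event["value"] > ALERT_THRESHOLD:
        alerts.append(event)
    # Batch path: keep everything for heavier offline analysis.
    batch_store.append(event)

def batch_report():
    """Batch path job: runs periodically over the full history."""
    values = [e["value"] for e in batch_store]
    return {"count": len(values), "mean": statistics.mean(values)}
```

The same event feeds both paths: the speed path reacts within milliseconds, while the batch path trades latency for completeness.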




U.S. regulators grappling with self-driving vehicle security
U.S. Transportation Secretary Elaine Chao said in San Francisco on Tuesday that “one thing is certain — the autonomous revolution is coming. And as government regulators, it is our responsibility to understand it and help prepare for it.” She said “experts believe AVs can self-report crashes and provide data that could improve response to emergency situations.” One issue is whether self-driving vehicles should be required to be accessible to all disabled individuals, including the blind, the report noted. The Transportation Department is expected to release updated autonomous vehicle guidance later this summer that could address some of the issues raised during the meetings. Automakers, Waymo, a unit of Alphabet Inc, and other participants in the nascent autonomous vehicle industry have called for federal rules to avoid a patchwork of state regulation. However, the process of developing a federal legal framework for such vehicles is slow-moving.


The rise of artificial intelligence DDoS attacks
The major turning point in the evolution of DDoS came with the automatic spreading of malware. Malware is a term you hear a lot; it describes malicious software. The automatic spreading of malware represented the major route to automation and marked the first phase of fully automated DDoS attacks. Now, attackers could increase distribution and schedule attacks without human intervention. Malware could automatically infect thousands of hosts and apply lateral movement techniques, spreading from one network segment to another. Establishing a foothold in a new segment is known as beachheading, and malware could beachhead from one part of the world to another. There was still one drawback, and for the bad actor it was a major one: the environment was still static, never dynamically changing signatures based on responses from the defense side. The botnets' behavior was not variable; they were ordered by the C&C servers to sleep and wake up, with no mind of their own. As I said, there is only so much bandwidth out there. So, these types of network attacks started to become less effective.



Automation could lift insurance revenue by $243 billion
First, explaining the vision clearly and securing leadership buy-in. “By establishing a clear and compelling vision, organizations demonstrate that intelligent automation is a strategic imperative and are able to answer critical questions,” the report says. Second, developing a clear pilot process. “The automation business case will need to assess the impact on transaction processing time and employee time saved and consider variables such as the volume of transactions or the number of exceptions in a specific process,” according to Capgemini. Firms should also consider starting with “low-hanging fruit” and engaging talent through hackathons and accelerators, the report says. Third, scaling up with an automation center of excellence. “To promote effective collaboration with the CoE, organizations should consider incentivizing functions based on business benefits derived from implementation of intelligent automation,” Capgemini suggests. Fourth, industrializing automation.


In-memory computing: enabling continuous learning for the digital enterprise
speech bubble constructed from abstract letters
Today’s in-memory computing platforms are deployed on a cluster of servers that can be on-premises, in the cloud, or in a hybrid environment. The platforms leverage the cluster’s total available memory and CPU power to accelerate data processing while providing horizontal scalability, high availability, and ACID transactions with distributed SQL. When implemented as an in-memory data grid, the platform can be easily inserted between the application and data layers of existing applications. In-memory databases are also available for new applications or when initiating a complete rearchitecting of an existing application. The in-memory computing platform also includes streaming analytics to manage the complexity around dataflow and event processing. This allows users to query active data without impacting transactional performance. This design also reduces infrastructure costs by eliminating the need to maintain separate OLTP and OLAP systems.
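The "inserted between the application and data layers" idea is essentially a read-through/write-through pattern, which can be sketched with a plain dict standing in for both the grid and the database:

```python
class InMemoryGrid:
    """Toy stand-in for an in-memory data grid inserted between the
    application and an existing database (here, a plain dict).

    Real platforms partition the store across a server cluster and add
    distributed SQL, ACID transactions and replication; the read-through
    access pattern sketched here is the same.
    """

    def __init__(self, database):
        self.database = database
        self.cache = {}
        self.db_reads = 0            # instrumentation for the sketch

    def get(self, key):
        if key not in self.cache:    # miss: read through to the database
            self.db_reads += 1
            self.cache[key] = self.database[key]
        return self.cache[key]       # hit: served from memory

    def put(self, key, value):
        self.cache[key] = value      # write to the grid...
        self.database[key] = value   # ...and through to the database
```

Because the grid answers repeat reads from memory, the database sees each key at most once here, which is the acceleration the article describes, minus the clustering.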



Hospital Diverts Ambulances Due to Ransomware Attack
The ransomware attack Monday impacted the enterprise IT infrastructure, including the electronic health records system, at Harrisonville, Mo.-based Cass Regional Medical Center, which includes 35 inpatient beds and several outpatient clinics, a spokeswoman tells Information Security Media Group. As of Wednesday morning, about 70 percent of Cass' affected systems were restored, she says. Except for diverting urgent stroke and trauma patients to other hospitals "out of precaution," Cass Regional has continued to provide inpatient and outpatient services for less urgent situations as it recovers from the attack, she says. "We've gone to our downtime processes," she says, which include resorting to paper records while the hospital's Meditech EHR system is offline during the restoration and forensics investigation. The hospital is working with an unnamed international computer forensics firm to decrypt data in its systems, she adds, declining to disclose the type of ransomware involved in the attack or whether the hospital paid a ransom to obtain a decryption key from extortionists.


Quote for the day:

"The mediocre leader tells. The good leader explains. The superior leader demonstrates. The great leader inspires." -- Gary Patton

Daily Tech Digest - July 11, 2018

Georgia Tech report outlines the future of smart cities

One key point researchers made is that IoT deployed in public spaces – in collaboration between city governments, private enterprise and citizens themselves – has a diverse group of stakeholders to answer to. Citizens require transparency and rigorous security and privacy protections, in order to be assured that they can use the technology safely and have a clear understanding of the way their information can be used by the system. The research also drilled down into several specific use cases for smart city IoT, most of which revolve around engaging more directly with citizens. Municipal services management offerings, which allow residents to communicate directly with the city about their waste management or utility needs, were high on the list of potential use cases, along with management technology for the utilities themselves, letting cities manage the electrical grid and water system in a more centralized way. Public safety was another key use case – for example, the idea of using IoT sensors to provide more accurate information to first responders in case of emergency.



10 Tips for Managing Cloud Costs

Part of the reason why cost management is so challenging is because organizations are spending a lot of money on public cloud services. More than half of enterprises (52%) told RightScale that they spend more than $1.2 million per year on cloud services, and more than a quarter (26%) spend over $6 million. That spending will likely be much higher next year, as 71% of enterprises plan to increase cloud spending by at least 20%, while 20% expect to double their current cloud expenditures. Given those numbers, it's unsurprising that Gartner is forecasting that worldwide public cloud spending will "grow 21.4% in 2018 to total $186.4 billion, up from $153.5 billion in 2017." Another problem that contributes to cloud cost management challenges is the difficulty organizations have tracking and forecasting usage. The survey conducted for the SoftwareONE Managing and Understanding On-Premises and Cloud Spend report found that unpredictable budget costs was one of the biggest cloud management pain points for 37% of respondents, while 30% had difficulty with lack of transparency and visibility.
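The Gartner forecast quoted above is easy to sanity-check; applying the rounded 21.4% growth rate to the 2017 figure reproduces the 2018 total to within a tenth of a billion dollars:

```python
spend_2017 = 153.5        # $bn, 2017 worldwide public cloud spend (Gartner)
growth_rate = 0.214       # Gartner's forecast growth rate for 2018

spend_2018 = spend_2017 * (1 + growth_rate)
print(f"Forecast 2018 spend: ${spend_2018:.1f}bn")   # close to the quoted $186.4bn
```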


How to Receive a Clean SOC 2 Report

Having a documented control matrix will be beneficial for more than just compliance initiatives; it becomes your source for how risk controls are developed and implemented and can be useful for augmenting corporate information security policies. For SOC 2, the control matrix becomes an important reference document for auditors. For instance, Trust Services Criteria 4 relate to monitoring of controls, so creating a list of how your organization is confirming controls are well designed and operating effectively makes it easy for auditors to validate that your stated controls are in place, designed to meet your security and confidentiality commitments, and are effective in doing so. Here is a concrete example: A control in your environment says servers need to be hardened to CIS benchmarks. How are you evaluating the effectiveness of this control? Are the servers hardened to your specification before going into production? Are they meeting benchmarks on an ongoing basis? An easy way to meet the monitoring requirement is to use a tool like Tripwire Enterprise.
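A control matrix that pairs each stated control with an automated check might be sketched like this; the control wording and the hardening check are simplified stand-ins, and in practice a tool such as Tripwire Enterprise would run the checks continuously:

```python
def check_hardened(server):
    """Stand-in for a real CIS-benchmark evaluation."""
    return server["ssh_root_login"] == "disabled" and server["firewall"] == "on"

# Each row pairs a stated control with the check that monitors it.
control_matrix = [
    {
        "id": "CTRL-01",
        "control": "Servers are hardened to CIS benchmarks before production",
        "check": check_hardened,
    },
]

def monitor(servers):
    """Produce auditor-friendly evidence of which controls are effective."""
    results = []
    for row in control_matrix:
        failing = [s["name"] for s in servers if not row["check"](s)]
        results.append({"id": row["id"], "effective": not failing, "failing": failing})
    return results
```

Run on an ongoing basis, the output doubles as the monitoring evidence auditors look for: not just that a control exists, but that it is operating effectively.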


Most Enterprise of Things initiatives are a waste of money

What’s truly needed is a consolidated ability to capture and process all of the data and convert it into meaningful insights. Many companies provide analytics engines to do this (e.g., SAP, Google, Oracle, Microsoft, IBM, etc.). But to have truly meaningful company-wide analysis, a significantly more robust solution is needed than stand-alone, singular instances of business intelligence/analytics. How should companies enable the full benefits of EoT? They need a strategy that provides truly meaningful “actionable intelligence” from all of the various data sources, not just the 15 to 25 percent that is currently analyzed. That data must be integrated into a consolidated (although it may be distributed) data analysis engine that ties closely into corporate backend systems, such as ERP, sales and order processing, service management, etc. It’s only through a tightly integrated approach that the maximum benefits of EoT can be accomplished. Many current back-office vendors are attempting to make it easier for companies to accomplish this. Indeed, SAP is building a platform to integrate EoT data into its core ERP offerings with its Leonardo initiative.


Randy Shoup Discusses High Performing Teams

It is estimated that the intelligence produced by Bletchley Park, code-named "Ultra", ended the war two years early, and saved 14 million lives. ... Although the Bletchley Park work fell under the domain of the military, there was very little hierarchy, and the organisational style was open. The decryption was conducted using a pipeline approach, with separate "huts" (physical buildings on the campus) performing each stage of intercept, decryption, cataloguing and analysis, and dissemination. There was deep cross-functional collaboration within a hut, but extreme secrecy between each of them. There was a constant need for iteration and refinement of techniques to respond to newer Enigma machines and procedures, and even though the work was conducted under an environment of constant pressure the code-breakers were encouraged to take two-week research sabbaticals to improve methods and procedures. There was also a log book for anyone to propose improvements, and potential improvements were discussed every two weeks.


Intuit's CDO talks complex AI project to improve finances


The most obvious is a chat bot, but it could also provide augmented intelligence for our customer care representative; it could provide augmented intelligence for our accountants who are working on Intuit's behalf or private accountants who are using Intuit software. It could be deployed in internal processes where product teams learn how people interact with our software through a set of focus groups. So, it's one technology that could be instantiated across many different platforms and touchpoints. That's one of the exciting aspects from a technology perspective. If you think about how a human works, there are so many things that are amazing about humans, but one is that they have the ability to rapidly change contexts and rapidly deal with a changing environment. The touchpoint doesn't matter. It doesn't matter if you're talking on video, on the phone or in person. Generally speaking, people can deal with these channels of communication very easily. But it's hard for technology to do that. Technology tends to be built for a specific channel and optimized for that channel.


Ethereum is Built for Software Developers

In part, this is all thanks to what Ethereum has accomplished in a very short period. We give too much credit to Bitcoin’s price, which skyrocketed to nearly $20,000 in December 2017, but the reality is in the code, and Ethereum is now what all dApp platforms compare themselves with, not the decade-old Bitcoin model. As Ethereum solves the scalability problem, it will effectively untether itself from Bitcoin’s speculative price volatility. If Bitcoin is a bet, Ethereum is a sure thing. The main reason is the developer community it has attracted and the wide range of startups that use it, especially in the early phases of their development. As TRON might find out, once they go independent they may have a more difficult time attracting software developers. Ethereum must move beyond piggy-backing on the ICOs of 2017 and become the open-source, public, distributed world operating system it was designed to be. It has massive potential to fill, and in a crypto vacuum of hype and declining prices, Ethereum is perhaps the last chance before 2020, as the Chinese blockchains take over. The window is disappearing, guys.


Software Flaws: Why Is Patching So Hard?

"As OCR states, identifying all vulnerabilities in software is not an easy process, particularly for the end user or consumer," says Mac McMillan, CEO of security consultancy CynergisTek. Among the most difficult vulnerabilities to identify and patch in healthcare environments "are those associated with software or devices of a clinical nature being used directly with patients," he says. "There are many issues that make this a challenge, including operational factors like having to take the system off line or out of production long enough to address security. Hospitals don't typically stop operations because a patch comes out. The more difficult problems are ones associated with the vulnerability in the software code itself, where a patch will not work, but a rewrite is necessary. When that occurs, the consumer is usually at a disadvantage." Fricke says applications that a vendor has not bothered to keep current are the trickiest to patch. "Some vendors may require the use of outdated operating systems or web browsers because their software has not been updated to be compatible with newer versions of operating systems or web browsers," he says.


5 security strategies that can cripple an organization

Security teams today have a two-faceted information problem: siloed data and a lack of knowledge. The first issue stems from the fact that many companies are only protecting a small percentage of their applications and, therefore, have a siloed view of the attacks coming their way. Most organizations prioritize sensitive, highly critical applications at the cost of lower tier apps, but hackers are increasingly targeting the latter and exploiting them for reconnaissance and often much more. It’s amazing how exposed many companies are via relatively innocuous tier 2 and legacy applications. The second, and more significant, issue can be summarized simply as, “you don’t know what you don’t know.” IT has visibility into straightforward metrics, but it often lacks insight into the sophistication of attempted breaches, how their risk compares to peers and the broader marketplace, and other trends and key details about incoming attack traffic. With visibility into only a small percentage of the attack surface, it’s very difficult to know whether the company is being targeted and exploited. Given the resource challenges noted above, it’s unrealistic to attempt to solve this problem with manpower alone.


What's the future of server virtualization?

Prior to server virtualization, enterprises dealt with server sprawl, with underutilized compute power, with soaring energy bills, with manual processes and with general inefficiency and inflexibility in their data-center environments. Server virtualization changed all that and has been widely adopted. In fact, it’s hard to find an enterprise today that isn’t already running most of its workloads in a VM environment. But, as we know, no technology is immune to being knocked off its perch by the next big thing. In the case of server virtualization, the next big thing is going small. Server virtualization took a physical device and sliced it up, allowing multiple operating systems and multiple full-blown applications to draw on the underlying compute power. In the next wave of computing, developers are slicing applications into smaller microservices which run in lightweight containers, and also experimenting with serverless computing (also known as function-as-a-service, or FaaS). In both of these scenarios, the VM is bypassed altogether and code runs on bare metal.



Quote for the day:


"The simple things are also the most extraordinary things, and only the wise can see them." -- Paulo Coelho


Daily Tech Digest - July 10, 2018

The value of visibility in your data centre

Keeping key enterprise applications up and running well is an absolute requirement for modern business. As estimated by Gartner, IDC and others, the cost of IT downtime averages out to around £4,200 per minute. A simple infrastructure failure might cost around £75,000, while the failure of a critical, public-facing application costs more like £378,000 to £755,000 per hour. When failures impact large-scale global logistics and cause widespread inconvenience to customers -- for example, last May's British Airways operations systems failure -- costs can quickly become staggering. BA estimated losing $102.19 million USD (£77.08 million GBP) in hard costs, including airfare refunds to stranded passengers, plus incalculable damage to reputation. BA's parent company, IAG, subsequently lost $224 million USD (£170 million GBP) in value, based on its then-current stock valuation. Preventing such disasters, or intervening effectively and rapidly when they occur, means giving developers and operations staff (DevOps) visibility into IT infrastructure, networks, and applications.
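At the quoted average of £4,200 per minute, per-hour figures are simple to reproduce; the multiplier below is a hypothetical knob reflecting that critical, customer-facing applications cost several times the average:

```python
COST_PER_MINUTE = 4_200          # £, the average figure quoted above

def outage_cost(minutes, multiplier=1.0):
    """Rough downtime cost; multiplier > 1 models critical applications."""
    return COST_PER_MINUTE * minutes * multiplier

print(f"£{outage_cost(60):,.0f} per hour at the average rate")
```

An hour at the average rate works out to £252,000, which sits between the simple-infrastructure and critical-application figures cited above, as an average should.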



Entrepreneurs think differently about risk


What is the worst thing that can happen? This is where a lot of people start, and it’s why they don’t even bother evaluating the rest of it. A person who hates their job and doesn’t want to work for anyone again might shrug off becoming a freelancer because of the risk involved in quitting a 9–5: losing the steady paycheque and benefits, potentially not having clients and needing to find a new job. However, I like to think “then what?” Will that kill me? No, it just means that you may need to find a new job. Plenty of people get laid off and need to find new jobs; that’s not the end of the world. If you’re worried that raising money for your startup will cause you to give away too much ownership in your company and that your investors will one day take control and oust you from the company, that’s a real fear. But even then, you would still own your shares in the company. Maybe if they oust you from the company it’s because you’re doing an abysmal job as CEO and they need someone who can grow the company. You would still own a big chunk of that company.


APT Trends Report Q2 2018


We also observed some relatively quiet groups coming back with new activity. A noteworthy example is LuckyMouse (also known as APT27 and Emissary Panda), which abused ISPs in Asia for waterhole attacks on high profile websites. We wrote about LuckyMouse targeting national data centers in June. We also discovered that LuckyMouse unleashed a new wave of activity targeting Asian governmental organizations just around the time they had gathered for a summit in China. Still, the most notable activity during this quarter is the VPNFilter campaign attributed by the FBI to the Sofacy and Sandworm (Black Energy) APT groups. The campaign targeted a large array of domestic networking hardware and storage solutions. It is even able to inject malware into traffic in order to infect computers behind the infected networking device. We have provided an analysis on the EXIF to C2 mechanism used by this malware. This campaign is one of the most relevant examples we have seen of how networking hardware has become a priority for sophisticated attackers. The data provided by our colleagues at Cisco Talos indicates this campaign was at a truly global level.


Organizations must act to safeguard 'the right to be forgotten'

The immediate need is clear—the capability to delete accounts and any associated personal data. But this is not as simple as it might first appear. Organizations are loath to give up data—it helps them improve their own business models, and quite frankly, it is profitable. One need only look at the recent reselling of user information to third parties to realize its value. Enterprises, then, need to be compelled to part with what they perceive as valuable—and governments are attempting this with legislation such as GDPR. Beyond the necessary business case, however, lie technological challenges. While many online services have built-in deletion and removal options, lingering personal data is a different matter. If this personal information is located in an application or structured database, then the process is relatively straightforward—eliminate the associated account and its data is also removed. If the sensitive data is in files—detached from applications governed by the business—then they behave like abandoned satellites orbiting the earth, forever floating in the void of network-based file shares and cloud-based storage.
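The two halves of that deletion problem can be sketched as follows; every data structure here is a hypothetical stand-in, and real systems need provenance metadata just to locate the detached copies:

```python
def forget_user(user_id, accounts, file_shares):
    """Toy 'right to be forgotten' deletion.

    Removing the account record is the straightforward part; the loop is
    a stand-in for the much harder job of hunting down detached copies of
    personal data across file shares and cloud storage.
    """
    accounts.pop(user_id, None)                       # structured data: easy
    removed = []
    for share, files in file_shares.items():
        for name in list(files):                      # copy: we mutate below
            if user_id in files[name].get("owners", []):
                del files[name]
                removed.append((share, name))
    return removed
```

The returned list doubles as an audit trail of what was erased, which regulators are likely to ask for.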


Big Data Is A Huge Boost To Emerging Telecom Markets

Big data in telecommunications is playing the biggest role by increasing the reach of major telecommunication brands in these markets. This is especially evident in Africa, where telecommunications market growth has been the strongest. In 2004, only 6% of African consumers owned a mobile device. This figure has grown sharply over the past 14 years, and there are now over 82 million mobile users throughout the continent. In some regions of Africa, the growth has been faster than even the most ambitious technology economists could have predicted; the number of people in Nigeria that own mobile devices has been doubling every year. Pairing big data and telecom has helped spur growth in the telecommunications industry in several ways. Here are some of the biggest. A growing number of telecommunications providers are investing more resources trying to reach consumers throughout Africa and other emerging telecom markets. According to an analysis by NobelCom, this will likely lead to cheaper telephone calls between consumers in various parts of the world.


Be smart about edge computing and cloud computing

Edge computing is a handy trick. It places processing and data retention on a system closer to the one it collects data from, and it enables autonomous processing. The architectural advantages are plenty, including not having to transmit all the data to the back-end systems—typically in the cloud—for processing. This reduces latency and can provide better security and reliability as well. But, and this is a big "but," edge computing systems don't stand alone. Indeed, they work with back-end systems to collect master data and provide deeper processing. This is how edge computing and cloud computing provide a single symbiotic solution. They are not, and will never be, mutually exclusive. Some best practices are emerging around edge computing that allow enterprises to make better use of both platforms. ... The edge computing hype will drive confusion in the next few years. To avoid that confusion, you need to understand what role each type of system plays, and you need to understand that very few technologies displace existing ones.
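The symbiosis described above can be sketched in a few lines: a hypothetical edge node summarizes raw sensor readings locally and forwards only a compact aggregate (plus any anomalous raw values) to the cloud back end, rather than shipping every reading upstream. The function name, threshold, and data are invented for illustration.

```python
from statistics import mean

# Hypothetical sketch: an edge node aggregates raw readings locally and
# sends only a summary to the cloud back end, cutting transmitted data.
def summarize_at_edge(readings, threshold=75.0):
    """Return a compact summary plus only the anomalous raw readings."""
    anomalies = [r for r in readings if r > threshold]
    return {
        "count": len(readings),
        "mean": mean(readings),
        "max": max(readings),
        "anomalies": anomalies,  # only these raw values go upstream
    }

raw = [70.1, 70.4, 71.0, 90.2, 70.8]   # e.g., one second of sensor data
summary = summarize_at_edge(raw)
print(summary["anomalies"])  # [90.2]
```

Five raw readings collapse into one small summary, which is the latency and bandwidth win the article attributes to processing at the edge; the cloud side still receives what it needs for deeper analysis.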


Can Cybersecurity be Entrusted with AI?

While the technology can help fill cybersecurity skill gaps, it is also a powerful tool for hackers. In short, AI can act as both guard and threat; what matters is who uses it, and for what purpose. In the end, it is natural intelligence that determines whether artificial intelligence is put to good or bad use. There are paid and free tools available that attempt to modify malware to bypass machine-learning antivirus software; the question is how to detect and stop them. Cyberattacks such as phishing and ransomware are said to be much more effective when powered by AI. On the other hand, AI is extremely good at recognizing patterns and anomalies in behavioral data, which makes it an excellent tool for threat hunting. AI may be the bright future of security, since the sheer volume of threats is becoming very difficult for humans to track alone; or it may usher in a far darker era. It all depends on natural intelligence, which is needed to develop AI and machine-learning tools in the first place. Despite popular belief, these technologies cannot replace humans (in my personal opinion). Using them requires human training and oversight.


How to Adopt a New Technology: Advice from Buoyant on Utilising a Service Mesh


Adopting technology and deploying it into production requires more than a simple snap of the fingers. Making the rollout successful and delivering real improvements is even tougher. When you're looking at a new technology, such as a service mesh, it is important to understand that the organizational challenges you'll face are just as important as the technology. But there are clear steps you can take in order to navigate the road to production. To get started, identify what problems a service mesh will solve for you. Remember, it isn't just about adopting technology; once the service mesh is in production, there need to be real benefits. This is the foundation of your road to production. Once you've identified the problem to be solved, it is time to go into sales mode. Even when the price tag is small, getting a service mesh into production requires real investment, and that investment will be required from more than just you. Changes impact coworkers in ways that range from learning a new technology to disruption of their mission-critical tasks.


How Businesses Can Navigate the Ethics of Big Data

The laws regarding data protection and privacy differ from country to country all across the world. The EU has an established set of laws pertaining to this matter, but they are visibly different from what the United States has. Privacy within the EU is often said to be stronger than it is in the U.S. Although the myths may exaggerate the difference, the EU is miles ahead of the U.S. when it comes to stringent data and privacy protection. Privacy is considered a fundamental right for all individuals living in the EU, and details about privacy and data protection are debated as much as gun control is in the U.S. The U.S. does have privacy protections, but the crux of the matter is that its laws are fragmented across separate governing bodies. The diversity of data-protection laws across countries points to the need for globally accepted norms governing how privacy and protection are provided to users and their data. Such norms would set standards and a pathway for others to follow when it comes to data protection.


Selling tech initiatives to the board: Eight success tips for IT leaders

Too many IT leaders, especially if they are busy running multiple projects, underestimate how much time it takes to put together a really persuasive presentation. Some of the most compelling presentations are those built around a demo of the technology being discussed or those with a strong video presentation that draws the audience into the topic. However, demos and videos aren't going to help if you don't have a clear and cogent message for board members who are charged with ensuring that the company is well run, is making the right kinds of investments, and is positioning itself for the future. If what you present doesn't check all of these boxes, it won't succeed. ... It's easy for a technology leader to get mired in tech talk and lose an audience. The board already knows that you know tech. What it wants to know is how well you understand the business and how tech can advance it. The best way to show them that you're focused on the business is to present a clear message in plain English and to avoid technology buzzwords and levels of detail that are extraneous to the business decision that has to be made.



Quote for the day:


"Growth happens when you fail and own it, not until. Everyone who blames stays the same." -- Dan Rockwell