Daily Tech Digest - October 02, 2018

SIE Europe
SIE Europe is co-founded by three international Internet luminaries: Dr. Paul Vixie, Chairman and CEO of Farsight Security; Christoph Fischer, CEO of BFK edv-consulting GmbH; and Peter Kruse, co-founder of CSIS Security Group A/S. “We founded SIE Europe to build a European-based community of Internet defenders who want to make the Internet safer for all users. As part of this initiative, SIE Europe will provide the infrastructure to collect, aggregate and share real-time DNS data in strict compliance with the privacy laws and regulations of the European Union, including the General Data Protection Regulation (GDPR),” said Vixie. All online transactions, good or bad, begin with the DNS. By providing visibility into the IP addresses, domain names and other digital artifacts of the DNS used by threat actors, security professionals will be able to accurately identify and map criminal infrastructures, and take preventive measures to protect their networks from future cybercrime activity.
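
The DNS-sharing model SIE Europe describes is essentially passive DNS: raw lookup observations are collapsed into deduplicated records mapping names to the addresses they have resolved to over time. The article doesn't describe SIE Europe's actual data formats, so the record shape below is only a hypothetical sketch of the idea:

```python
def aggregate_pdns(observations):
    """Collapse raw DNS observations into passive-DNS style records.

    `observations` is an iterable of (timestamp, name, rrtype, rdata)
    tuples; the result maps each unique (name, rrtype, rdata) to its
    first-seen / last-seen times and a hit count.
    """
    records = {}
    for ts, name, rrtype, rdata in observations:
        key = (name, rrtype, rdata)
        rec = records.setdefault(key, {"first_seen": ts, "last_seen": ts, "count": 0})
        rec["first_seen"] = min(rec["first_seen"], ts)
        rec["last_seen"] = max(rec["last_seen"], ts)
        rec["count"] += 1
    return records
```

An analyst can then pivot from a suspect domain to every IP it has resolved to, and when, without the raw traffic leaving the collection point.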



Facebook could face up to $1.6bn fine for data breach


Facebook said the attack exploited the “complex interaction of multiple issues in our code” and stemmed from a change made to the video uploading feature in July 2017. In response, Facebook said it had fixed the vulnerability, informed law enforcement and reset the access tokens of the almost 50 million accounts known to be affected. “We’re also taking the precautionary step of resetting access tokens for another 40 million accounts that have been subject to a ‘View As’ look-up in the last year. As a result, around 90 million people will now have to log back in to Facebook, or any of their apps that use Facebook Login,” said Facebook. The company has also turned off the “View As” feature while it conducts a security review, but admitted it has yet to determine whether accounts were misused or any information accessed. Facebook said it is also still trying to establish the location and identity of the attackers and will reset the access tokens of any other accounts it believes may have been affected.


The CTO role: ‘It’s about planning and business opportunities’

Every CTO role is different; in this case, Hanson focuses on the sales side of the business, whereas other CTOs are more concerned with the development of products. “We have some very intelligent people in our product management division who look after the actual development of products. So I’m not on the product side. I’m more on the sales side,” confirms Hanson. His responsibility centres around finding out how Informatica’s prospects and customers use the company’s technology. He needs to understand their challenges, their governance and compliance issues moving forward, as well as the pressures in their marketplace and how they need to leverage data to stay successful and competitive. “It’s really my job to try and collect that information, and think about innovative uses for our products as they currently exist, and what type of initiatives we should try and help our prospects and customers with,” explains Hanson.


Big Data: changing the future of business models

The ability to analyse data and make informed decisions from it is vital if a business is to succeed. In an increasingly competitive landscape, it is imperative that firms are able to make quick and increasingly complex decisions to cater for changing customer demands and evolving market conditions. By harnessing data, businesses can identify new opportunities within their existing operations, run those operations more efficiently, increase profitability and improve customer service, gaining a competitive edge over their rivals. Over the years, our data team has worked alongside businesses to help them find data-driven solutions and technologies with the aim of fast-tracking their objectives and stimulating growth.


How I Lost My Faith in Private Blockchains

The business and legal worlds operate around centralized entities, and while that remains the case, any forced attempt at decentralization is likely to fall short. While we may see decentralized businesses in the future, they are far more likely to come from the public blockchain world, where they can grow organically in an entirely new paradigm. In the meantime, institutions and individuals should evaluate permissioned blockchains like any other technology: they aren't magic. The benefits of a technology should never be assumed based on buzzwords, hype or the fear that "everyone else is doing it, so why shouldn't I?" Instead, benefits should be assessed by asking what the business problem is, what technology options are available, and what the quantifiable costs and benefits of each are.


LinkedIn the latest to introduce its own server designs

The idea behind the designs is to reduce the amount of work it takes to deploy servers in a data center. Again, this seems to assume people will build their own servers the way LinkedIn and other hyperscalers do. It’s all designed to be like building with Lego bricks. LinkedIn also wanted to standardize hardware across both primary and edge data centers, which is likely why Vapor IO is involved. Edge locations don’t have a readily available technician, so if a company sends a technician to an edge container, the last thing it wants is for the tech to waste time trying to figure out the layout of the equipment. By having common hardware between the two, the technician will work with familiar gear. LinkedIn claims these designs will mean being able to build infrastructure for 1 percent of the cost, with six to ten times faster integration, greater power efficiency and other cost savings. However, that does not address the issue of IT staff building the hardware. LinkedIn, Google, Facebook, etc., can afford to hire engineers who build servers all day. Your average IT shop does not.


This is how cyber attackers stole £2.26m from Tesco Bank customers

The attackers most likely used an algorithm which generated authentic Tesco Bank debit card numbers and, using those virtual cards, they attempted to make thousands of unauthorised debit card transactions. The FCA said Tesco Bank's failures include the way in which the bank distributed debit card numbers and mistakes made in the reaction to the attack which meant that no action was taken for almost a day after the incident was first uncovered. A number of deficiencies in the way Tesco Bank handled security left customers vulnerable to cyber attackers in an incident that was "largely avoidable", said the FCA analysis of the incident which Tesco Bank had to this point been tight-lipped about -- to the frustration of other financial institutions. Poor design of Tesco Bank debit cards played a significant role in creating security vulnerabilities that led to thousands of customers having their accounts emptied. One of these involved the PAN numbers -- the 16-digit card number sequence used to identify all debit cards.
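
The FCA report doesn't name the algorithm, but card numbers are easy to generate in bulk because a PAN's last digit is a Luhn check digit rather than a secret. A quick sketch of the standard Luhn check:

```python
def luhn_valid(pan: str) -> bool:
    """Return True if the digit string passes the Luhn check used on PANs."""
    digits = [int(c) for c in pan if c.isdigit()]
    total = 0
    # Double every second digit from the right; subtract 9 if the result > 9.
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0
```

Any 16-digit string passing this check "looks like" a real card number, which is why issuers are expected to randomize the remaining digits rather than hand them out in predictable sequences.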


Google Chrome 70 is coming. Are your security certificates in order?

For those unfamiliar with the details of this, in 2017 Google and Mozilla decided to deprecate all Symantec-issued digital certificates based on their assessment that Symantec did not correctly validate its SSL certificates prior to issuing them to customers. Google and Mozilla then decided to put in place a multi-step plan to distrust any certificates issued from the Symantec PKI. This plan phased out Symantec certificates over the next year and a half. Instead of following the Google plan, Symantec elected to sell its certificate business to DigiCert. Despite the transaction, the requirement to replace all certificates issued from the Symantec PKI remained intact, requiring millions of certificates to be replaced during 2018. To assist customers in replacing their certificates, DigiCert contacted each certificate holder, offering free replacement certificates chained to the trusted DigiCert roots. The first major distrust date was on December 1, 2017, when no additional TLS certificates could be issued through the Symantec PKI. Prior to that date, DigiCert cut over all issuance processes to its PKI and validation systems.
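
A site operator could triage their own certificates by checking the issuer of the served chain against the legacy Symantec brands. The helper below works on the dict shape returned by Python's `ssl.SSLSocket.getpeercert()`; the organization names are an illustrative subset, not Chrome's exact distrust list:

```python
# Brands that operated under the legacy Symantec PKI (illustrative
# subset, not Chrome's exact distrust list).
DISTRUSTED_ORGS = {
    "Symantec Corporation",
    "GeoTrust Inc.",
    "thawte, Inc.",
    "VeriSign, Inc.",
}

def issuer_org(cert):
    """Pull organizationName out of the 'issuer' field of the dict
    returned by ssl.SSLSocket.getpeercert(): a tuple of RDNs, each a
    tuple of (key, value) pairs."""
    for rdn in cert.get("issuer", ()):
        for key, value in rdn:
            if key == "organizationName":
                return value
    return ""

def likely_distrusted(cert):
    """True if the certificate's issuer organization matches a legacy
    Symantec brand and therefore needed replacement before Chrome 70."""
    return issuer_org(cert) in DISTRUSTED_ORGS
```

A real check would inspect the full chain, not just the leaf's issuer, but this is enough to flag candidates for replacement.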


Open Compute Project eyes European enterprise adoption with Experience Centre opening


The OCP’s championing of 21-inch server rack designs is often cited as a partial barrier to enterprise adoption of its technologies, as it makes it potentially harder for users to deploy the technology in existing datacentres where smaller 19-inch server racks are the norm. The centre’s opening is being overseen by datacentre infrastructure manufacturer Rittal and OCP supplier and service provider Circle B, in conjunction with Switch Datacenters, which is in the midst of building a datacentre based on OCP principles. “The three companies have determined that in the technology sector, IT managers at large enterprises and governments in the ... “These principles form the basis on which many hyperscalers operate. By adopting OCP designs in their datacentres, large enterprises and governments can benefit from the same advantages as the hyperscalers: cost reductions, lower energy usage and much more flexibility.”


Building Agile Data Lakes with Robust Ingestion and Transformation Frameworks – Part 1


With the advent of Big Data technologies like Hadoop, there has been a major disruption in the information management industry. The excitement around it is not only about the three Vs – volume, velocity and variety – of data but also the ability to provide a single platform to serve all data needs across an organization. This single platform is called the Data Lake. The goal of a data lake initiative is to ingest data from all known systems within an enterprise and store it in this central platform to meet enterprise-wide analytical needs. However, a few years back Gartner warned that a large percentage of data lake initiatives have failed or will fail - becoming more of a data swamp than a data lake. How do we prevent this? We have teamed up with one of our partners, Clarity Insights, to discuss the data challenges enterprises face, what caused data lakes to become swamps, discuss the characteristics of a robust data ingestion framework and how it can help make the data lake more agile.
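
The article doesn't show an implementation, but a "robust ingestion framework" usually means metadata-driven ingestion: each source is described by configuration (name, key, column mappings), so onboarding a new feed is a config change rather than new pipeline code. A toy sketch with a hypothetical `orders` source:

```python
# Hypothetical source description: onboarding a feed is a config change.
SOURCE_CONF = {
    "name": "orders",
    "key": "order_id",                               # business key in the source
    "columns": {"order_id": "id", "amt": "amount"},  # source name -> lake name
}

def ingest(rows, conf):
    """Config-driven ingestion step: keep only mapped columns, rename
    them per the config, and de-duplicate on the declared key
    (last record seen wins)."""
    mapped_key = conf["columns"][conf["key"]]
    out = {}
    for row in rows:
        rec = {conf["columns"][src]: val
               for src, val in row.items() if src in conf["columns"]}
        out[rec[mapped_key]] = rec
    return list(out.values())
```

Because the schema knowledge lives in metadata rather than code, the same loader serves every source, which is one of the characteristics that keeps a lake from drifting into a swamp.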



Quote for the day:


"One measure of leadership is the caliber of people who choose to follow you." -- Dennis A. Peer


Daily Tech Digest - October 01, 2018

Drone defense -- powered by IoT -- is now a thing

What exactly constitutes a “malicious” drone isn’t entirely clear, but it could range from teenagers using a small drone to peek over a fence to the kinds of military drones being used as weapons in several areas around the world. And, in fact, the companies cite “military bases, venues, cities, enterprises, correctional facilities, and more” as potential customers. Further, DroneTracker, Dedrone’s “airspace security platform,” is designed to detect a wide variety of drones, the company notes, “including commercial, consumer and military-grade, as well as autonomous drones.” Dedrone leverages IoT sensor data to detect, classify, mitigate, and localize drone-based threats, the company says, while AT&T provides the LTE connectivity. Once a drone threat is detected, Dedrone notifies security personnel. ... But is the threat really that great for most business and industrial applications? I’m still not sure whether knowing that AT&T and Dedrone — along with many others, I’m sure — are on the case makes me feel safer or more vulnerable.



Understanding Risks to Data Drives Controls Efficiencies

Business professionals and IT practitioners agree that data are a valuable commodity for enterprises in many ways. Yet the notion of using data to help monitor and manage risk tolerances in audit and assurance activities is often overlooked. Data should be considered and analyzed as the enterprise selects, plans and deploys controls, and should also be part of the enterprise's evaluation of the performance of those controls. This was recently highlighted by ISACA, which has put forth new guidance in partnership with SecurityScorecard titled Continuous Assurance Using Data Threat Modeling. Developed in collaboration with industry experts, practitioners and ISACA subject matter experts, the guidance provides an excellent overview of how to adapt threat modeling to data in transit and data at rest as a strategy to put forth a more holistic, comprehensive and continuous model for understanding data risk and for analyzing potential risk in the supply chain.


Network security challenges remain a top concern for IT pros


More than one-third of respondents ranked network security challenges as their top concern when planning, deploying and managing enterprise networks. As mobile devices continue to expand and redefine the network edge, network security challenges remain a top issue, the study found. Additionally, 83% of respondents identified several types of network and telecom fraud as serious issues. More than half of IT pros cited identity fraud as a primary concern in relation to real-time communications. The expansion of communications channels -- such as voice, email, video, chat and in-app communications -- also affects network complexity, deployment, management and what defines the network edge.  To combat network security challenges and improve control, IT pros said they would consider emerging technologies, such as biometrics, artificial intelligence and blockchain. The survey, dubbed Enterprise Networks in Transition: Taming the Chaos, also highlighted software-defined WAN as a technology that could help enterprise networks evolve. Yet, according to the survey, North America lags other regions in software-defined networking deployments.


Is Blockchain a Universal Platform?

In order to be a responsible prosumer with a micro-grid dedicated to full renewable energy use, blockchain allows you to monitor the exchange of energy from the point of creation – via a solar panel, for example – to its consumption, not just in your home but in another prosumer’s home. Within this digital ledger, energy use can be monitored and maintained in such a way that community members are actively engaged with their utilities in a way that benefits the community as a whole. It would be completely ridiculous to suggest that the insurance industry is an emerging market – in fact, it is the largest market in the world, with a staggering $1.2 trillion in revenue. Despite this position, insurance is caught in a slog deeply rooted in traditional practices. Blockchain can be used to create sub-markets within the industry: peer-to-peer insurance, which cuts out the middlemen and provides greater portions of premiums to the policy holder; parametric insurance, which uses a smart contract to automatically pay twenty percent of any type of claim


Ransomware Crypto-Locks Port of San Diego IT Systems

The attacker or group of hackers behind the attempted shakedown has also demanded a ransom, payable in bitcoin, in exchange for the promise of a decryption key, port officials say. The port says that while IT systems have been disrupted, much of the port's business continues without interruption. "It is important to note that this is mainly an administrative issue and normal port operations are continuing as usual," says Port of San Diego CEO Randa Coniglio in a statement. "The port remains open, public safety operations are ongoing, and ships and boats continue to access the bay without impacts from the cybersecurity incident." The Port of San Diego - spanning the cities of Chula Vista, Coronado, Imperial Beach, National City and San Diego along the 34 miles of the San Diego Bay - is the fourth largest of California's 11 ports. It includes two maritime cargo terminals, two cruise ship terminals, 22 public parks, the Harbor Police Department and leases for hundreds of businesses, including 17 hotels, 74 restaurants and three retail centers, plus museums and bay tours.


The Right Diagnosis: A Cybersecurity Perspective

No security program is perfect, but some need more attention than others. What are the checkpoints that will help organizations understand where their security programs are ailing, how to make the right diagnosis, and begin the proper treatment? ... Just as the brain controls how the body functions, the leadership of a security organization controls how that organization functions. When looking to evaluate and understand where a security program stands, one of the first diagnostics should be focused on leadership. Do security leaders have a clear vision? Do they have a solid strategy? Are they focused on the right goals and priorities? Do they have the right plan to make their strategy a reality? Do they have the ear of the executives, the board, and other stakeholders? Are they building the right team? ... Security operations could be considered the central function of a security program, analogous to its heartbeat. Just as a healthy, regular heartbeat is critical to the health of the body, a healthy security operations program is critical to the health of a security organization. Is the security operations team properly trained?


Why 5G will disappoint everyone

The wireless carriers hope 5G will enable them to compete with or replace ISPs, cable companies, and satellite internet and TV companies. So that’s nice. But it will probably be more than 15 years before 5G replaces 4G for most users most of the time. 5G won’t be reliable enough anytime soon for companies such as Apple and Samsung to remove the supercomputer-like processing power from smartphones and move everything to the cloud. I’m afraid that $1,000-plus smartphones are here to stay. And because of the way 5G works, rollouts will soon face another huge hurdle. ... The technology comes with a requirement that towers be far greater in number and far closer to users. Some residents in North Potomac say more than 60 5G wireless towers have been installed less than 30 feet from their front doors. It’s possible that definitive, widely accepted proof will emerge that clearly shows a health risk from 5G wireless equipment. It’s likely that debate over the health effects will continue. But it’s not even remotely conceivable that everybody will agree that 5G is harmless.


How the 'human and machine' model will transform customer service

The good news is that automated interactions will only become more tailored and efficient in the future as more businesses turn to AI in order to understand conversations in any language, automate repetitive processes and solve customer problems faster than the competition. Ultimately, it’s not a matter of AI and automation replacing customer support agents but rather enabling them to become ‘super agents’. And, the advantages that these ‘super agents’ pose for business growth are undeniable - from visibility (AI knows everything your users are doing, your customer support team does not), to sheer productivity (AI doesn’t need to sleep, eat or take time off). Inevitably, the decision to add automation to the customer service mix requires smart decisions and a solid understanding as to where and how automation can achieve cost savings while always fostering better, more personalized customer experiences.


Digital transformation in 2019: The big insights and trends

On average, organizations believe that half of their revenue will come from digital channels by 2020. Furthermore, the World Economic Forum estimates that the overall economic value of digital transformation to business and society will top $100 trillion by 2025. Other similar data are easy to find. These represent vital macroeconomic trends and the most significant attainable new business potential for the typical enterprise. Any way you look at it, the largest growth opportunities that most organizations can access now are in seizing the white space in these rapidly expanding digital markets. The latest trends in digital transformation for next year reflect some particularly hard-won lessons from the past few years, on both the business and technology sides. It's worth taking the time to understand how these insights came about, as organizations earlier in the journey can avoid many of the same painful, expensive, and time-consuming realizations along the way. As they say, one useful definition of 'smart' is not making all the mistakes oneself.


The Future of Brain Science

Fortunately for neuroscience, mathematicians, data scientists, and computer scientists have been wrestling with their own “information overload” challenges, coping with exponential increases in the volume, variety, and velocity of digital data spawned by the Moore’s law revolution in digital technology. Google, for instance, ingests unimaginable volumes of data every second, which it must somehow “monetize” (make money from, because its services are largely “free”) by precisely targeting digital advertisements to people who use Google search or Gmail. Google can only do this with the aid of massive cloud computing systems running complex math and AI algorithms that quickly recognize patterns and act upon these insights to serve up ads in real time. One branch of AI, called “cognitive computing,” holds particular promise for extending Nicolelis' work to humans. Cognitive computing goes well beyond simple pattern recognition, aiming for a deep understanding of the underlying causes of complex patterns rather than mere recognition that patterns exist.



Quote for the day:


"Never doubt that a small group of thoughtful, committed citizens can change the world. Indeed, it is the only thing that ever has." -- Margaret Mead


Daily Tech Digest - September 30, 2018

How to successfully implement an AI system

Companies should calculate the anticipated cost savings that would be gained from a successful AI deployment, using that as a starting point for investment so that the costs of errors or shortfalls against expectations are minimised if they occur. The cost savings should be based on efficiency gains, as well as the increased productivity that can be harnessed in other areas of the business by freeing up staff from administration tasks. This ensures companies do not over-invest before seeing initial results; if changes are necessary, they do not cannibalise potential ROI and can still switch to other viable use cases. Before advising companies on what solution they should invest in, it's important to first establish what they want to achieve. Digital colleagues can provide a far superior level of customer service; however, they require greater resources to set up. Most chatbots are not scalable: once deployed, they cannot be integrated into other business areas, as they are designed to answer FAQs based on a static set of rules. Unlike digital colleagues, they cannot understand complex questions or perform several tasks at once.
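
The "savings as a starting point for investment" rule is simple to make concrete: cap the initial spend at a fraction of the savings anticipated over the evaluation horizon, so a shortfall doesn't sink the ROI. All the inputs below are hypothetical planning numbers, not from the article:

```python
def max_initial_investment(hours_saved_per_week, loaded_hourly_cost,
                           horizon_weeks, margin=0.5):
    """Cap the initial AI spend at a fraction (`margin`) of the
    anticipated savings over the evaluation horizon, leaving headroom
    for errors or shortfalls against expectations."""
    expected_savings = hours_saved_per_week * loaded_hourly_cost * horizon_weeks
    return expected_savings * margin
```

For example, automating 100 staff-hours a week at a $40 loaded rate saves $208,000 over a year, so a 50% margin caps the initial investment at $104,000.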


How adidas is Creating a Digital Experience That's Premium, Connected, and Personalized

Take something like a product description. How do we really have the product descriptions and offerings so that if you're interested in sports we will help you find exactly the product that you need for the sport that you're interested in? We will also educate you and bring you back at different points in time to help you find out what you need when you need it, or with an engagement program. Ultimately, like the membership program, that it has something that's sticky, that you can give back to something, even more, you can participate in events and experiences. For us, a lot of it’s really deepening those experiences but also exploring new technologies and new areas. Omnichannel was kind of the original wave which happened and I said it was the freight train that came past us a couple of years ago. Now we're also looking at what those next freight trains are, whether it's technologies like blockchain or experiencing picking up a new channel. For example, we're working extensively with Salesforce on automation, how we can automate consumer experiences.


What Deep Learning Can Offer to Businesses


With the capabilities of artificial intelligence, the way words are processed and interpreted can change dramatically. It turns out we can define the meaning of a word based on its position in the text without needing a dictionary. ... One of the most successful recent applications of deep learning to image recognition came from the ImageNet Large Scale Visual Recognition Challenge, when Alex Krizhevsky applied convolutional neural networks to classify images from ImageNet, a dataset containing 1.2 million pictures, into 1,000 different classes. In 2012, Krizhevsky’s network, AlexNet, achieved a top-5 test error rate of 15.3%, outperforming traditional computer vision solutions by more than 10 percentage points. Krizhevsky's result changed the landscape of the data science and artificial intelligence field from the perspective of both research and business application. In 2012, AlexNet was the only deep learning model at ILSVRC (the ImageNet Large Scale Visual Recognition Competition). Two years later, in 2014, there were no conventional computer vision solutions among the winners.
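
For readers unfamiliar with the metric: "top-5 error" counts an image as correct if the true label appears anywhere in the model's five highest-scoring classes. A minimal sketch over plain score lists:

```python
def top5_error(scores, labels):
    """Fraction of examples whose true label is not among the five
    highest-scoring classes. `scores` is a list of per-class score
    lists, `labels` the true class indices."""
    misses = 0
    for row, label in zip(scores, labels):
        top5 = sorted(range(len(row)), key=lambda i: row[i], reverse=True)[:5]
        if label not in top5:
            misses += 1
    return misses / len(labels)
```

AlexNet's 15.3% means that for roughly one image in seven, the correct class wasn't even in its top five guesses out of 1,000.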



Can Global Semantic Context Improve Neural Language Models?

Global co-occurrence count methods like LSM lead to word representations that can be considered genuine semantic embeddings, because they expose statistical information that captures semantic concepts conveyed within entire documents. In contrast, typical prediction-based solutions using neural networks only encapsulate semantic relationships to the extent that they manifest themselves within a local window centered around each word (which is all that’s used in the prediction). Thus, the embeddings that result from such solutions have inherently limited expressive power when it comes to global semantic information. Despite this limitation, researchers are increasingly adopting neural network-based embeddings. Continuous bag-of-words and skip-gram (linear) models, in particular, are popular because of their ability to convey word analogies of the type “king is to queen as man is to woman.”
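
The "global co-occurrence counts" that methods like LSM work from are simple to picture: count how often each pair of words appears within a fixed window of each other across the corpus. A minimal sketch (window size and tokenization are simplifications):

```python
from collections import Counter

def cooccurrence(tokens, window=2):
    """Count symmetric (word, context) pairs within a fixed window --
    the statistic that global methods factorize and that local
    prediction models like skip-gram implicitly sample."""
    counts = Counter()
    for i, w in enumerate(tokens):
        lo = max(0, i - window)
        for j in range(lo, i):
            counts[(w, tokens[j])] += 1
            counts[(tokens[j], w)] += 1
    return counts
```

A global method factorizes this whole matrix at once, which is exactly the document-level statistical information a local prediction window never sees.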


Big Data and Machine Learning Won’t Save Us from Another Financial Crisis


Machine learning can be very effective at short-term prediction, using the data and markets we have encountered. But machine learning is not so good at inference, learning from data about underlying science and market mechanisms. Our understanding of markets is still incomplete. And big data itself may not help, as my Harvard colleague Xiao-Li Meng has recently shown in “Statistical Paradises and Paradoxes in Big Data.” Suppose we want to estimate a property of a large population, for example, the percentage of Trump voters in the U.S. in November 2016. How well we can do this depends on three quantities: the amount of data (the more the better); the variability of the property of interest (if everyone is a Trump voter, the problem is easy); and the quality of the data. Data quality depends on the correlation between the voting intention of a person and whether that person is included in the dataset. If Trump voters are less likely to be included, for example, that may bias the analysis.
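
Meng's point about data quality is easy to demonstrate: when inclusion in the dataset correlates with the quantity being measured, a huge biased sample loses to a much more modest random one. A toy simulation with made-up numbers (not from the article):

```python
import random

def simulate(n_pop=100_000, true_p=0.46, sample_frac=0.2, bias=0.5, seed=0):
    """Compare a biased sample (units with the trait are only `bias`
    times as likely to be included) with an unbiased sample of the same
    size. All numbers are illustrative."""
    rng = random.Random(seed)
    pop = [rng.random() < true_p for _ in range(n_pop)]
    # Biased inclusion: correlated with the value being measured.
    biased = [v for v in pop if rng.random() < sample_frac * (bias if v else 1.0)]
    unbiased = rng.sample(pop, len(biased))
    est = lambda s: sum(s) / len(s)
    return est(biased), est(unbiased), sum(pop) / n_pop

biased_p, unbiased_p, pop_p = simulate()
```

The biased estimate lands far from the population rate no matter how large the sample grows, while the random sample of identical size is accurate, which is the sense in which "big data itself may not help."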


Spending on cognitive and AI systems to reach $77.6 billion in 2022

Banking and retail will be the two industries making the largest investments in cognitive/AI systems in 2018 with each industry expected to spend more than $4.0 billion this year. Banking will devote more than half of its spending to automated threat intelligence and prevention systems and fraud analysis and investigation while retail will focus on automated customer service agents and expert shopping advisors & product recommendations. Beyond banking and retail, discrete manufacturing, healthcare providers, and process manufacturing will also make considerable investments in cognitive/AI systems this year. The industries that are expected to experience the fastest growth on cognitive/AI spending are personal and consumer services (44.5% CAGR) and federal/central government (43.5% CAGR). Retail will move into the top position by the end of the forecast with a five-year CAGR of 40.7%. On a geographic basis, the United States will deliver more than 60% of all spending on cognitive/AI systems throughout the forecast, led by the retail and banking industries.
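
For context on the growth figures, a CAGR compounds annually, so the five-year multiples are larger than they may look; retail's 40.7% CAGR implies roughly 5.5x spending by the end of the forecast. A quick sketch of the arithmetic:

```python
def cagr(start, end, years):
    """Compound annual growth rate from `start` to `end` over `years`."""
    return (end / start) ** (1 / years) - 1

def grow(start, rate, years):
    """End value after compounding `rate` annually for `years`."""
    return start * (1 + rate) ** years
```

For example, `grow(1, 0.407, 5)` gives about 5.5, the multiple behind retail moving into the top spending position.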


5 ways industrial AI is revolutionizing manufacturing

artificial intelligence / machine learning / network
In manufacturing, ongoing maintenance of production line machinery and equipment represents a major expense, having a crucial impact on the bottom line of any asset-reliant production operation. Moreover, studies show that unplanned downtime costs manufacturers an estimated $50 billion annually, and that asset failure is the cause of 42 percent of this unplanned downtime. For this reason, predictive maintenance has become a must-have solution for manufacturers who have much to gain from being able to predict the next failure of a part, machine or system. Predictive maintenance uses advanced AI algorithms in the form of machine learning and artificial neural networks to formulate predictions regarding asset malfunction. This allows for drastic reductions in costly unplanned downtime, as well as for extending the Remaining Useful Life (RUL) of production machines and equipment. In cases where maintenance is unavoidable, technicians are briefed ahead of time on which components need inspection and which tools and methods to use, resulting in very focused repairs that are scheduled in advance.
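
Production systems use the learned models mentioned above (machine learning, artificial neural networks), but the core idea of predictive maintenance can be illustrated with a hand-rolled baseline: flag a sensor reading that drifts far outside its recent window. A simplified sketch:

```python
def drift_alerts(readings, window=5, threshold=3.0):
    """Flag indices where a reading sits more than `threshold` standard
    deviations from the mean of the preceding `window` readings --
    a crude stand-in for a learned failure-prediction model."""
    alerts = []
    for i in range(window, len(readings)):
        base = readings[i - window:i]
        mean = sum(base) / window
        var = sum((x - mean) ** 2 for x in base) / window
        std = var ** 0.5
        if std == 0:
            std = 1e-9  # avoid division by zero on a flat baseline
        if abs(readings[i] - mean) / std > threshold:
            alerts.append(i)
    return alerts
```

An alert at a given index is the trigger for the advance briefing described above: which component to inspect, and with which tools, before the failure actually occurs.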


Data Centers Must Move from Reducing Energy to Controlling Water

While it is a positive development that overall energy use by data centers is being reduced around the globe, a key component that has for the most part been glossed over is water usage. One example of this is the continued use of open-cell cooling towers, which take advantage of evaporative cooling to cool the air with water before it goes into the data center. While this solution reduces energy, its water usage is very high. Raising the issue of water reduction is the first step in creating ways our industry can do something about it. As we experience the continued deluge of internet of things devices, projected to exceed 20 billion by 2020, we will only be able to ride this wave if we keep energy low and start reducing water usage. The first question becomes: how can cooling systems reject heat more efficiently? Let’s say heat is coming off the server at 100 degrees Fahrenheit. The idea is to efficiently capture that heat and bring it to the atmosphere as close to that temperature as possible, but it is all dependent on the absorption system.
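
The water cost of evaporative cooling can be sized from the latent heat of vaporization: rejecting heat by evaporation consumes roughly 1.6 liters of water per kilowatt-hour. A back-of-envelope sketch (it assumes all heat leaves as latent heat and ignores blowdown and drift, so real towers use more):

```python
LATENT_HEAT_KJ_PER_KG = 2260  # latent heat of vaporization of water, ~kJ/kg

def evaporation_rate_l_per_hr(heat_kw):
    """Liters of water evaporated per hour to reject `heat_kw` kilowatts,
    assuming all heat leaves as latent heat (1 kg of water ~ 1 L)."""
    kg_per_s = heat_kw / LATENT_HEAT_KJ_PER_KG
    return kg_per_s * 3600
```

A 1 MW data hall on pure evaporative cooling thus evaporates on the order of 1,600 liters an hour, which is why the energy savings of open-cell towers come with such a high water bill.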


AI and Automation to Have Far Greater Effect on Human Jobs by 2022

With the domination of automation in a business framework, the workforce can be extended to new productivity-enhancing roles. More than a quarter of surveyed businesses expect automation to lead to the creation of new roles in their enterprise. Apart from allotting contractors more task-specialized work, businesses plan to engage workers in a more flexible manner, utilizing remote staffing beyond physical offices and decentralization of operations. Among all, AI adoption has taken the lead in terms of automation for the reduction of time and investment in end-to-end processes. “Currently, AI is the most rapidly growing technology and will for sure create a new era of the modern world. It is the next revolution – relieving humans not only from physical work but also mental efforts and simplifies tasks extensively,” opined Kuppa. While human-performed tasks dominate today’s work environment, the frontier is expected to change in the coming years.


Modeling Uncertainty With Reactive DDD

Reactive is a big thing these days, and I'll explain later why it's gaining a lot of traction. What I think is really interesting is that the way DDD was used or implemented, say back in 2003, is quite different from the way we use DDD today. If you've read my red book, Implementing Domain-Driven Design, you're probably familiar with the fact that the bounded contexts I model in the book are separate processes, with separate deployments. Whereas in Evans' blue book, bounded contexts were separated logically, but sometimes deployed in the same deployment unit, perhaps in a web server or an application server. In our modern-day use of DDD, I'm seeing more people adopting DDD because it aligns with having separate deployments, such as in microservices. One thing to keep clear is that the essence of Domain-Driven Design is still what it always was: modeling a ubiquitous language in a bounded context. So, what is a bounded context? Basically, the idea behind a bounded context is to put a clear delineation between one model and another model.
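As a hedged illustration of that delineation (the names are invented for this example, not taken from the talk): the same real-world concept, a product, is modeled differently in two bounded contexts, each with its own ubiquitous language, and crossing the boundary goes through an explicit translation rather than a shared model.

```python
from dataclasses import dataclass

# Catalog context: a "product" is something browsable and describable.
@dataclass
class CatalogProduct:
    sku: str
    name: str
    description: str

# Ordering context: a "product" is only what's needed to price a line item.
@dataclass
class OrderLineProduct:
    sku: str
    unit_price_cents: int

def to_order_line(p: CatalogProduct, price_cents: int) -> OrderLineProduct:
    """Explicit translation at the context boundary (an anti-corruption
    layer in DDD terms), instead of sharing one model across contexts."""
    return OrderLineProduct(sku=p.sku, unit_price_cents=price_cents)

item = to_order_line(CatalogProduct("A-1", "Widget", "A fine widget"), 499)
print(item)  # OrderLineProduct(sku='A-1', unit_price_cents=499)
```

Whether the two contexts live in one deployment unit (blue-book style) or as separate services (microservices style), the boundary in the model stays the same.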



Quote for the day:


"A company is like a ship. Everyone ought to be prepared to take the helm." -- Morris Wilks


Daily Tech Digest - September 29, 2018

Optimizing Multi-Cloud, Cross-DC Web Apps and Sites

Latency, payload, caching and rendering are the key measures when evaluating website performance. Each round trip is subject to the connection latency. The time from when a webpage is requested by the user to when the resources on that page finish downloading in the browser is directly related to the weight of the page and its resources: the larger the total content size, the longer it takes to download everything needed for the page to become functional for the user. Using caching and default caching headers may reduce latency, since less content is downloaded and fewer round trips may be needed to fetch resources, although some round trips may still occur to validate that cached content is not stale. Finally, browsers need to render the HTML page and the resources served to them. Client-side work can cause poor rendering in the browser and a degraded user experience; for example, blocking calls (say, third-party ads) or improper rendering of page resources can delay page load time.
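As a minimal sketch of the caching point (the header values here are illustrative choices, not recommendations from the article), a response can tell the browser how long content may be reused before a revalidation round trip is needed:

```python
from email.utils import formatdate

def caching_headers(max_age_seconds: int, etag: str) -> dict:
    """Headers letting a browser reuse a cached copy for max_age_seconds,
    then revalidate with a cheap conditional request instead of a full
    re-download (the server answers 304 Not Modified if the ETag matches)."""
    return {
        "Cache-Control": f"public, max-age={max_age_seconds}",
        "ETag": etag,
        "Date": formatdate(usegmt=True),
    }

h = caching_headers(86400, '"v42"')
print(h["Cache-Control"])  # public, max-age=86400
```

The revalidation request is the "round trip to validate that the content in the cache is not stale" mentioned above: it costs one round-trip latency but avoids re-downloading the payload.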


Lessons from the UK Government's Digital Transformation Journey

So many lessons! Some of my colleagues set out to document the higher level lessons. The result was an entire book -- Digital Transformation at Scale: Why the Strategy Is Delivery -- but there’s a huge amount more that couldn’t be included there. But top of the list is the importance of remaining focused on your purpose and your users’ needs. As technologists and agilists we can too easily be drawn into improving technology or simplifying processes without stepping back and asking why we have those things in the first place, or if the change we’re making is the right one. I’ve talked to a lot of teams in large organisations who have taken all the right steps in moving to agile but are still having trouble motivating their teams, and the missing piece is almost always being exposed directly to your users. Whether they’re end customers, or internal users, there’s nothing like seeing people use your products to motivate the team to make them better.


MissingLink.ai has launched this week to streamline and automate the entire deep learning life cycle for data scientists and engineers. “Work on MissingLink began in 2016, when my colleagues Shay Erlichmen [CTO], Rahav Lussato [lead developer], and I set out to solve a problem we experienced as software engineers. While working on deep learning projects at our previous company, we realized we were spending too much time managing the sheer volume of data we were collecting and analyzing, and too little time learning from it,” Yosi Taguri, CEO of MissingLink, wrote in a post. “We also realized we weren’t alone. As engineers, we knew there must be a more efficient solution, so we decided to build it. Around that time, we were joined by Joe Salomon [VP of product], and MissingLink was born.” The team decided to focus on machine learning and deep learning because of their potential to “impact our lives in profound ways.” Machine learning has already been used for detecting diseases, in autonomous vehicles and in public safety situations, according to the company.


Big data architecture: Navigating the complexity

First, there are the many different engines you might choose to run with your big data. You could choose Splunk to analyze log files, or Hadoop for large-file batch processing, or Spark for data stream processing. Each of these specialized big data engines requires its own data universe, and ultimately the data from these universes must come together, which is where the DBA is called in to do the stitching. But that's not all. Organizations are now mixing and matching on-premises and cloud-based big data processing and data storage. In many cases, they are using multiple cloud vendors as well. Once again, data and intelligence from these various repositories must be blended together at some point, as the business requires. "This is a system integration problem that vendors need to help their clients solve," said Anoop Dawar, SVP of product management and marketing for MapR, a converged data platform for big data. "You have to not only be able to provide a platform for all of the different big data processing engines and data stores that are out there, but you must also be able to rapidly provide access to new big data processing engines and data stores as they emerge."


Key Difference Between The Cloud And The Data Center

While the purpose of both is the same, namely the storage, management and maintenance of data, there is an evident architectural difference between the two. The first key difference is that a data center is a land-based, in-house, physical setup, with IT professionals physically present and working together as a team. A cloud, on the other hand, is more like a virtual store with no physical presence of its own: it depends on the internet and is accessible to the user only over the internet. There is also a notable difference in the security each offers. Understandably, cloud computing is less secure than a data center, since the latter is an in-house setup whose operators are liable for protecting your data. Cloud computing, by contrast, is internet-based, which exposes you to an increased risk of data leaks and privacy invasion. Moreover, with cloud computing you are responsible for your own security, because the third-party operator of the cloud is not liable for your data.


5 Easy Ways To Determine If Your Company Needs Blockchain


The purest form of blockchain is in tracking and authenticating a digital asset (music, movies, digital wallets, education certifications, mortgage contracts, and so on) with digital transactions logged against it. Blockchains can also track and authenticate physical assets (gold, organic food, artwork, manufactured parts, and such), though those assets can require checkpoints considered “off-chain.” In such cases, you’ll need trusted sources in your business network to audit and authenticate the physical asset, which can be tricky. Consider a notorious example from the aerospace industry. Some argue that well before the Challenger space shuttle disaster in 1986, some parties knew that the spacecraft’s O-ring seals contained a flaw, but this design and manufacturing problem wasn’t addressed properly. What if an aerospace industry blockchain was tracking the origin, specification, materials, and testing of that part and any known problems? Only once the integrity of that part and required tests had been confirmed by many trusted participants could the part be used.


Axon Conference Panel: Why Should We Use Microservices?

For Schrijver, it’s all about scalability. In terms of teams it’s the ability to work with multiple teams on one product. In terms of operations it’s the ability to independently scale different parts of a system. He thinks that if you build a microservices system the right way you can have almost unlimited horizontal scalability. Buijze pointed out that technically it doesn’t matter whether we work with monoliths or microservices; in theory you can scale out a monolith just as well as microservices. What microservices gives us is a strong and explicit boundary to every service. Although the architects draw limits for communication between components, we as developers are good at ignoring them. If it’s technically possible to directly communicate with another component we will do that, ignoring any rules the architects have set up. Keeping those boundaries intact is much easier when they are explicit and even more so if a component is managed by another team.


The rise of open source use across state and local government

A simple solution for agencies looking to defend against open source vulnerabilities is to turn to enterprise open source providers. Enterprise-ready solutions undergo rigorous testing to ensure that any defect is detected, prevented, or addressed in a timely manner, thereby mitigating an agency's risk. Further, enterprise solutions protect government networks from these risks throughout the product lifecycle by ensuring the code is up to date, secure, and functioning as expected. Investing in future-oriented, enterprise open source solutions can also help lower the total cost of ownership, because agencies can sidestep the costly and painful vendor lock-in that comes with proprietary software. Instead, enterprise open source lets users run platform-agnostic software and leaves each agency free to make the hardware, operating system, and environment decisions that best fit its requirements and mission. At the end of the day, an enterprise open source solution provides government users with the best of both worlds.


CrowdStrike CTO on securing the endpoint and responding to a breach

The first was that a modern security platform had to be built as a cloud-native solution. The cloud was critical not just for ease of management and rapid agent rollouts, but also for protection of off-premise assets and workloads deployed in public and hybrid clouds. The cloud would also be used to dramatically reduce the performance impact an endpoint agent has on a system, as heavy processing work would be offloaded to elastically scalable cloud compute. Finally, the cloud could leverage the power of crowdsourcing: the collection of trillions of security-related events from endpoint agents deployed all over the world, to learn from every adversary action and take away attackers' ability to reuse tradecraft as they launch attacks against new victims. The second principle was to leverage machine learning and artificial intelligence to predictively identify new threats by training algorithms on the largest dataset in the security industry – over a trillion events collected every single week by CrowdStrike Falcon agents protecting organisations in 176 countries.


What is Blockchain Technology? A Step-by-Step Guide For Beginners

What is Blockchain Technology?
Information held on a blockchain exists as a shared — and continually reconciled — database. This is a way of using the network that has obvious benefits. The blockchain database isn’t stored in any single location, meaning the records it keeps are truly public and easily verifiable. No centralized version of this information exists for a hacker to corrupt. Hosted by millions of computers simultaneously, its data is accessible to anyone on the internet. ... As revolutionary as it sounds, Blockchain truly is a mechanism to bring everyone to the highest degree of accountability. No more missed transactions, human or machine errors, or even an exchange that was not done with the consent of the parties involved. Above anything else, the most critical area where Blockchain helps is to guarantee the validity of a transaction by recording it not only on a main register but a connected distributed system of registers, all of which are connected through a secure validation mechanism.
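A hedged, minimal sketch of that "continually reconciled" idea (illustrative only; real blockchains add consensus, signatures, and proof-of-work on top): each record carries the hash of the previous one, so tampering with any entry breaks every later link and is trivially detectable by any host of the shared database.

```python
import hashlib
import json

def block_hash(prev_hash: str, payload: dict) -> str:
    """Deterministic hash over the previous link and this block's payload."""
    data = json.dumps({"prev": prev_hash, "payload": payload}, sort_keys=True)
    return hashlib.sha256(data.encode()).hexdigest()

def append(chain: list, payload: dict) -> None:
    prev = chain[-1]["hash"] if chain else "genesis"
    chain.append({"prev": prev, "payload": payload, "hash": block_hash(prev, payload)})

def is_valid(chain: list) -> bool:
    """Recompute every link; an edited payload changes its hash and no
    longer matches what the next block recorded as its 'prev'."""
    prev = "genesis"
    for block in chain:
        if block["prev"] != prev or block["hash"] != block_hash(prev, block["payload"]):
            return False
        prev = block["hash"]
    return True

chain = []
append(chain, {"tx": "alice->bob", "amount": 5})
append(chain, {"tx": "bob->carol", "amount": 2})
print(is_valid(chain))                    # True
chain[0]["payload"]["amount"] = 500       # a hacker edits one record...
print(is_valid(chain))                    # False: the chain no longer reconciles
```

Because every copy of the database can run the same check independently, there is no central version for an attacker to quietly corrupt.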



Quote for the day:


"To have long term success as a coach or in any position of leadership, you have to be obsessed in some way." -- Pat Riley


Daily Tech Digest - September 28, 2018

7 Most Prevalent Phishing Subject Lines

People are curious and they want to help, he continues, and it's these two qualities that make them susceptible to phishing attacks. When they do fall for scams, most employees are quick to realize it. "I'm really busy," "I missed that," and "I should've caught that email" are all commonly heard phrases from victims who have opened malicious emails and realized their mistake. "No matter how much technology you put in place to block them, stuff always gets through," Hayslip adds. Webroot recently scanned thousands of phishing emails from the past 18 months to learn more about trends in the subject lines designed to trick targets. Hayslip presented the findings to about 100 fellow CISOs around the country and learned "almost everybody's seeing the same thing," he says. Financially related messages and notions of urgency are commonly seen in phishing emails, albeit under different subject lines. John "Lex" Robinson, cybersecurity strategist at Cofense, echoes Hayslip's sentiments and says attackers are getting better and better at understanding the context of the emails they're sending and whom they're targeting.
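As a deliberately naive sketch of the pattern the survey describes, financial language plus urgency cues in subject lines, one could flag suspicious subjects by keyword. Real filters use far richer signals, and the keyword lists below are assumptions for illustration, not Webroot's findings:

```python
URGENCY = {"urgent", "immediately", "action required", "final notice"}
FINANCIAL = {"invoice", "payment", "payroll", "wire", "refund"}

def subject_risk(subject: str) -> int:
    """Score 0-2: one point for urgency cues, one for financial cues."""
    s = subject.lower()
    return int(any(w in s for w in URGENCY)) + int(any(w in s for w in FINANCIAL))

print(subject_risk("URGENT: invoice payment required"))  # 2
print(subject_risk("Team lunch on Friday"))              # 0
```

The obvious weakness, easy keyword evasion, is one reason the article stresses training people rather than relying on technology alone.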


Agile IT held back by legacy tech and legacy budgeting


“Re-architecting and integrating applications is difficult work, and for many CIOs, this barrier is best overcome by seeking outside help and bringing in skilled application remediation experts from a third party,” the report said. A big majority of organisations (87%) say legacy applications are slowing their journey to creating an agile workspace, with the main causes cited as cost of re-architecting or transforming applications (68%), disruption to the user experience (43%), and a lack of in-house skills to modernise applications (36%). Evolving alongside this application challenge has been the shift towards cloud computing, with organisations looking to software-as-a-service (SaaS) applications to increase workspace agility. However, only 25% of organisations think SaaS applications meet their requirements, and this figure drops to 17% in mid-size organisations. Overall, 84% of organisations say an inability to roll out new services and applications to their workforce quickly is affecting business competitiveness.


Blockchain Applications That Are Transforming Society

Primitive forms of smart property exist. Your car key, for instance, may be outfitted with an immobilizer, where the car can only be activated once you tap the right protocol on the key. Your smartphone, too, will only function once you type in the right PIN code. Both work on cryptography to protect your ownership. The problem with primitive forms of smart property is that the key is usually held in a physical container, such as the car key or SIM card, and can't be easily transferred or copied. The blockchain ledger solves this problem by allowing blockchain miners to replace and replicate a lost protocol. ... Any material object is a ‘thing.’ It becomes part of the internet of things (IoT) when it has an on/off switch that connects it to the internet and to other objects. By being connected to a computer network, an object such as a car becomes more than just an object: the connections are now people-to-people, people-to-things, and things-to-things. The analyst firm Gartner says that by 2020 there will be over 26 billion connected devices. Others raise that number to over 100 billion.


Quantum Computing: Why You Should Pay Attention


Typical computers rely on bits, which are represented by ones and zeros. Using just these two numbers, our computers can solve any arithmetic question and have excellent logic capabilities. Quantum computers, on the other hand, replace bits with quantum bits, or qubits. Unlike their binary counterparts, qubits can exist as both ones and zeros at the same time, in a so-called superposition. This isn't an analogy: according to the most common interpretation of quantum mechanics, qubits actually are ones and zeros simultaneously. With this capability, qubits are able to solve certain problems that are computationally expensive using binary arithmetic and logic in far fewer steps, and some problems can be solved in just a single step. Although the very concept of quantum computing sounds outlandish, devices are being developed by tech giants including Intel and Google, and Microsoft is already unveiling toolkits for developing software for quantum computers.
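A hedged sketch of superposition using plain state-vector simulation (this is the standard quantum formalism, not tied to any vendor's hardware): a qubit's state is a pair of amplitudes, and the Hadamard gate turns the definite state |0⟩ into an equal superposition of 0 and 1.

```python
import math

# A qubit state is (amplitude of |0>, amplitude of |1>);
# measurement probabilities are the squared magnitudes of the amplitudes.
ZERO = (1.0, 0.0)  # the definite state |0>

def hadamard(state):
    """The Hadamard gate: maps |0> to an equal mix of |0> and |1>."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    a, b = state
    return (abs(a) ** 2, abs(b) ** 2)

plus = hadamard(ZERO)
print(probabilities(plus))            # ~(0.5, 0.5): both outcomes equally likely
print(probabilities(hadamard(plus)))  # ~(1.0, 0.0): applying H twice undoes it
```

The second print illustrates why superposition is more than randomness: the amplitudes interfere, so applying the gate twice restores the original state, something no classical coin flip can do.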


What Is Reinforcement Learning - A Simple Explanation & Practical Examples


Similar to toddlers learning how to walk who adjust actions based on the outcomes they experience such as taking a smaller step if the previous broad step made them fall, machines and software agents use reinforcement learning algorithms to determine the ideal behavior based upon feedback from the environment. It’s a form of machine learning and therefore a branch of artificial intelligence. Depending on the complexity of the problem, reinforcement learning algorithms can keep adapting to the environment over time if necessary in order to maximize the reward in the long-term. So, similar to the teetering toddler, a robot who is learning to walk with reinforcement learning will try different ways to achieve the objective, get feedback about how successful those ways are and then adjust until the aim to walk is achieved. A big step forward makes the robot fall, so it adjusts its step to make it smaller in order to see if that's the secret to staying upright. It continues its learning through different variations and ultimately is able to walk.
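The feedback loop described above can be sketched as tabular Q-learning, a standard reinforcement-learning algorithm; the tiny "walk to the goal" environment below is an invented illustration, not from the article. The agent tries actions, observes rewards, and updates its value estimates until the best behavior emerges:

```python
import random

random.seed(0)
N = 5                      # positions 0..4; the goal is position 4
ACTIONS = (-1, +1)         # step left or step right
Q = {(s, a): 0.0 for s in range(N) for a in ACTIONS}
alpha, gamma, eps = 0.5, 0.9, 0.2   # learning rate, discount, exploration

for _ in range(500):                  # episodes of trial and error
    s = 0
    while s != N - 1:
        # Mostly exploit what's known; occasionally explore a random step.
        if random.random() < eps:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda act: Q[(s, act)])
        s2 = min(max(s + a, 0), N - 1)
        r = 1.0 if s2 == N - 1 else -0.1   # reward at the goal, small cost per step
        # Q-learning update: nudge the estimate toward reward + discounted future value.
        Q[(s, a)] += alpha * (r + gamma * max(Q[(s2, b)] for b in ACTIONS) - Q[(s, a)])
        s = s2

# The learned policy: from every position, the best action is to step right.
policy = {s: max(ACTIONS, key=lambda act: Q[(s, act)]) for s in range(N - 1)}
print(policy)
```

Like the toddler adjusting its stride, the agent never sees the rule "go right"; it discovers it purely from the rewards its own actions produce.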


Organisations are beginning to find cyber threats more effectively

“Threat hunting is part of nonstandard security operations. It’s a good combination of threat intelligence and hypothesis generation based on likely and probable locations of intrusions into a network. Once an organisation begins consuming threat intelligence, natural hunting begins to take place,” said Robert M. Lee, SANS certified instructor and co-author of the report. Rob T. Lee, co-author and curriculum lead for digital forensic and incident response training, SANS Institute added: “One of the most notable highlights of the 2018 survey is that it demonstrates a more accurate use of threat hunting in many organisations. This change in threat hunting practices has increased since the last survey in 2017, which showed many organisations typically were hunting incorrectly through traditional intrusion detection. In this year’s survey, many more organisations were using proper threat intelligence to help identify the best locations inside an organisation’s network to look for anomalistic behaviours that are direct indicators of threats.”


“Everything is fine” vs. “we’re doomed” isn’t the way to frame election security

Humans are really bad at assessing risk. We tend to fixate on catastrophic but unlikely occurrences, like terrorism, while ignoring mundane risks that cause cumulative harm, such as eating poorly, not maintaining bridges, or failing to save for retirement. This difficulty in assessing and responding to risk is especially pronounced in information security, where non-technical people in particular find themselves forced to choose between extreme paranoia (and thus a defeatist attitude) and unrealistic optimism ... Mitigations that improve, but by definition do not perfect, security are worth nothing if we are not able to calibrate our trust to the level of security they provide. I trust that the lock on the front door of my apartment is good enough to withstand all but the most determined attacks. But if someone with a battering ram, explosives, or a talented black-bag team wants to get into my apartment, I know that I can't prevent intrusion by those kinds of attackers. Nevertheless, I don't stay awake at night obsessing over unlikely threats or threats I cannot defend against.


Analytics Translator – The Most Important New Role in Analytics


The role of Analytics Translator was recently identified by McKinsey as the most important new role in analytics, and a key factor in the failure of analytics programs when the role is absent. As our profession of data science has evolved, any number of authors, including myself, have offered different taxonomies to describe the differences among the various ‘tribes’ of data scientists. We may disagree on the categories, but we agree that we’re not all alike. Ten years ago, around the time that Hadoop and Big Data went open source, there was still a perception that data scientists should be capable of performing every task in the analytics lifecycle. The obvious skills were model creation and deployment, and data blending and munging. Other important skills in this bucket would have included setting up data infrastructure. And finally there were the skills just assumed to come with seniority: storytelling and great project management. Frankly, when I entered the profession, this was true, and for the most part, in those early projects, I did indeed do it all.


Shell CTO Yuri Sebregts talks about using AI to amplify the human impact of its workforce


As well as the predictive maintenance project, the company has also created a service called Machine Vision, using Azure-based deep learning technologies, that combines CCTV footage with internet of things devices to alert employees at its service stations to potential safety hazards occurring on the forecourt in real time, such as someone lighting a cigarette or driving erratically close to a petrol pump. There is also potential for this technology to be applied in a stocktaking context in Shell’s warehouses and petrol stations, says Sebregts, so that staff can intervene and replenish supplies as and when needed. Beyond its retail sites, robotics is already commonly used to install equipment in offshore environments where it would be hazardous or impossible to send humans, and Sebregts also sees potential for AI to enhance how that work is carried out in future.


How Data Security Improves When You Engage Employees in the Process

A great example of inclusive programming is anti-phishing training, which teaches employees to identify fraudulent attempts to obtain sensitive information electronically, often for malicious reasons, under the guise of a trustworthy source. For this training to be successful, employees must learn how to make choices when they receive potential phishing emails. Experiential training with real-world simulations, where employees build their knowledge base and their ability to make choices in the moment, as it relates to them and their learning style, has proved to be effective. According to Herman Miller's research on the Learning Pyramid, learning by doing yields a 75% knowledge retention rate, compared with 5% for lectures. Giving employees a choice of password management software for meeting company security requirements may also foster an environment of partnership rather than rigid control.



Quote for the day:


"The leader has to be practical and a realist, yet must talk the language of the visionary and the idealist." -- Eric Hoffer