Daily Tech Digest - December 04, 2018

This technology allows paralysed people to communicate with the power of their mind

 A participant in the BrainGate clinical trial plays “Ode to Joy” on a virtual keyboard interface.
“In this study, we’ve harnessed that know-how to restore people’s ability to control the exact same everyday technologies they were using before the onset of their illnesses. It was wonderful to see the participants express themselves or just find a song they want to hear,” he says. The investigational BrainGate BCI includes a baby aspirin-sized implant that detects the signals associated with intended movements produced in the brain’s motor cortex. Those signals are then decoded and routed to external devices. BrainGate researchers and other groups using similar technologies have shown that the device can enable people to move robotic arms or to regain control of their own limbs, despite having lost motor abilities from illness or injury. Two of the participants in this latest study had weakness or loss of movement of their arms and legs due to amyotrophic lateral sclerosis (ALS), a progressive disease affecting the nerves in the brain and spine that control movement.



IBM boosts AI chip speed, bringing deep learning to the edge

IBM is unveiling new hardware that brings power efficiency and improved training times to artificial intelligence (AI) projects this week at the International Electron Devices Meeting (IEDM) and the Conference on Neural Information Processing Systems (NeurIPS), with 8-bit precision for both its analog and digital chips for AI. Over the last decade, computing performance for AI has improved at a rate of 2.5x per year, due in part to the use of GPUs to accelerate deep learning tasks, the company noted in a press release. However, this improvement is not sustainable: most of the potential performance of this design model--a general-purpose computing solution tailored to AI--has already been extracted, and it will not be able to keep pace with hardware designed exclusively for AI training and development. Per the press release, "Scaling AI with new hardware solutions is part of a wider effort at IBM Research to move from narrow AI, often used to solve specific, well-defined tasks, to broad AI, which reaches across disciplines to help humans solve our most pressing problems."
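IBM's specific chip designs aside, the core idea behind 8-bit precision can be sketched in a few lines: map floating-point values onto 256 integer levels with a shared scale factor, trading a small, bounded error for much cheaper arithmetic and storage. The snippet below is a generic affine quantizer for illustration only, not IBM's method:

```python
# Illustrative 8-bit quantization of a float vector (generic technique,
# not IBM's implementation).
def quantize_8bit(values):
    """Map floats to signed 8-bit ints with a shared scale factor."""
    max_abs = max(abs(v) for v in values)
    scale = max_abs / 127 if max_abs else 1.0  # 127 = int8 max magnitude
    q = [max(-128, min(127, round(v / scale))) for v in values]
    return q, scale

def dequantize(q, scale):
    return [x * scale for x in q]

weights = [0.42, -1.3, 0.07, 0.9981]
q, scale = quantize_8bit(weights)
approx = dequantize(q, scale)
# Round-to-nearest recovers each weight to within half a quantization step.
assert all(abs(a - w) <= scale / 2 + 1e-9 for a, w in zip(approx, weights))
```

The interesting engineering question, which IBM's announcement targets, is keeping training stable when gradients and activations are squeezed into so few levels.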


Cyborg: Manufacturing the man-machine

Technology has evolved to aid manufacturing processes and today it is helping significantly to eliminate the chances of human error. But despite this increasing role, “bots will not dominate shop floors or replace the physical workforce. The role of humans is, and will remain, critical at all stages of manufacturing – right from designing to body operations, final vehicle assembly, quality checks and delivering the finished product,” feels Subramanian. It’s still cheaper to use humans rather than machines in India. “So while automation is increasing, labour arbitrage plays out in India,” says Sid Chatterjee, vice-president, GreyOrange, a startup that supplies robots to automate supply chains in warehouses, distribution and fulfilment centres for a variety of customers such as Flipkart, Mahindra and Aramex. There is also apprehension about using high-tech, imported gear on the shop floor. “Deployment of machines is low in India. Europe makes a lot of automated gear, but with the limited market size in India, European companies don’t provide much on-ground support. If something goes wrong, you end up with a white elephant,” explains Milan Sheth, lead, intelligent automation, EY India.


MongoDB wants to get the database out of your way

One of the big ones was transaction support -- ACID transactions. This is an interesting feature because most Mongo applications don't need it very often, but it makes people a lot more comfortable because there are cases where it is useful. Another was mobile synchronisation: if you're building a mobile app you want to keep a subset of your data locally so, for example, if you're writing a story while you're on a train, it synchronises your data in the background; if you lose connectivity on the train it still works, it still keeps going. So for mobile it turned out to be a pretty big story. We have also done a ton of stuff with analytics -- one trend we see a lot is the merging of transactional and analytical workloads. A lot of people don't want to have separate systems for analytical and transactional work. They want to be doing real-time analytics on their transactional systems. And that works because we have this feature called Workload Isolation.
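The comfort that multi-document transactions provide is the all-or-nothing guarantee: either every write in a group applies, or none do. The toy key-value store below simulates that semantic in plain Python; it is an illustration of the concept, not MongoDB's actual driver API, and the class and its behaviour are hypothetical:

```python
# Toy illustration of multi-document atomicity: either every write in the
# batch applies, or the store is rolled back to its prior state.
class TinyStore:
    def __init__(self):
        self.docs = {}

    def transaction(self, writes):
        """Apply a list of (key, value) writes atomically.

        A None value stands in for any failing write; it aborts the whole
        batch, undoing anything already applied."""
        snapshot = dict(self.docs)  # snapshot taken before the batch
        try:
            for key, value in writes:
                if value is None:
                    raise ValueError(f"invalid write to {key!r}")
                self.docs[key] = value
        except ValueError:
            self.docs = snapshot  # roll back: no partial state survives
            raise

store = TinyStore()
store.transaction([("alice", 90), ("bob", 110)])  # debit/credit together
try:
    store.transaction([("alice", 80), ("bob", None)])  # second write fails
except ValueError:
    pass
assert store.docs == {"alice": 90, "bob": 110}  # first kept, second undone
```

This is the property that, as the interview notes, most applications rarely need but are reassured to have for the cases (like the debit/credit pair above) where partial application would corrupt the data.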


Katrina Roberts, a CIO at American Express, Shares Hiring Strategy

To attract more experienced people, I spend a lot of time at conferences. I just attended the Grace Hopper conference, and I go every year. We're big on partnerships with the Society of Hispanic Professional Engineers and organizations for African-American and transgender professionals. We sponsor hackathons – participants work on an AmEx business problem and learn about our technology and what we're doing. We're selling ourselves to them as much as they're selling themselves to us. ... Attitude is important; it means looking beyond what degree they've got and finding out their aptitude and attitude to be a technologist. The people who work for me want to learn, want to drive change – and want to change themselves. I spend time at the Hopper conference talking to candidates who have degrees in psychology and philosophy, not just computer science backgrounds. We've had to broaden our thinking beyond just hiring people with specific technology skills.


Six months on from GDPR

Organisations must trawl through their entire data infrastructure to create and maintain a constant, accurate map of their data. They need to pay particular attention to third-party systems such as CRM, HR, infrastructure- or platform-as-a-service, or analytics that are based in the cloud. This is especially important because they will then need to assess the GDPR readiness of their cloud provider as a data processor and make sure their contract includes a data processing agreement. Similarly, data controllers need to ensure that they can erase data from their cloud providers when they stop using the cloud service. As consumers can request information on, or the deletion of, all the personal data a company holds about them, the data controller must ensure they can meet this kind of request through their cloud provider. Consumers now hold more power over their data than before and, as we’ve seen with the recent complaints against Oracle, Criteo and others, they are exercising it.


The growing use of AI will increase data usage exponentially. As part of Singapore’s smart nation initiative, the government has planned to invest up to S$150m from the National Research Foundation in AI over five years through the AI Singapore programme. While first-generation AI architectures have historically been centralised, Equinix predicts that enterprises will enter the realm of distributed AI architectures, where AI model building and model inferencing will take place at the edge, physically closer to the source of the data. Beyond data protection, managing data in the cloud has its own set of rules, and if not done right the cost, complexity and risk can bring down the house, Sommer said, noting that the shift from on-premises and legacy datacentres should be done at a pace organisations are comfortable with.


Smartphone Users in India to Double by 2022: Cisco Visual Networking Index

This proliferation of smart devices will propel India’s per capita traffic consumption to nearly 14 Gigabytes by 2022, from 2.4 Gigabytes in 2017, in line with the global trend where more IP traffic will cross global networks than in all prior ‘internet years’ combined up to the end of 2016. In other words, more traffic will be created in 2022 than in the 32 years since the internet started. India will be a major driver of this, with the total number of internet users set to reach 840 million (60% of the population) by 2022, up from 357 million (27% of the population) in 2017. “By 2022, smartphone data consumption in India will increase by 5X – which proves the dominance of the smartphone as the communications hub for social media, video consumption, communications, and business applications, as well as traditional voice. As usage and expectations increase, the market opportunity for service providers rises simultaneously.”
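A quick back-of-the-envelope check ties the quoted figures together (assuming "per capita" is measured over a roughly constant total population, which the article does not state):

```python
# Sanity check of the multiples implied by the figures quoted above.
per_capita_2017, per_capita_2022 = 2.4, 14.0   # GB per capita
users_2017, users_2022 = 357e6, 840e6          # internet users in India

per_capita_growth = per_capita_2022 / per_capita_2017   # ~5.8x
user_growth = users_2022 / users_2017                   # ~2.35x

print(f"per-capita traffic multiple: {per_capita_growth:.1f}x")
print(f"internet-user multiple:      {user_growth:.2f}x")
```

The ~5.8x per-capita multiple squares roughly with the "increase by 5X" smartphone figure in the quote, while the user base itself grows a further ~2.35x over the same period.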



3 Ways Digital Adoption Can Make You a Better Business Leader

In order to become an effective leader in any digital transformation, it is critical that you do everything in your power to stay up to date in this ever-changing field. Start by keeping up with influencers in the tech industry, such as Isaac Sacolick, CIO of StarCIO, who runs a digital transformation blog. Or check out The Edge podcast, hosted by Michael Krigsman. Encourage others in your organization to follow in your footsteps to create the kind of business culture that embraces change. This can be done in several ways, such as offering opportunities to participate in continuing-education courses on digital practices, or allowing employees to attend relevant conventions on the subject. Having cutting-edge technology is becoming a necessity in today’s data-driven business climate. That said, your ability to understand the implications of adoption and the best ways to implement it is crucial. The CEOs and executives of tomorrow will need to be highly knowledgeable in the ways that lead to innovation and progress.


3 key elements to make data monetization possible

According to a recent study by McKinsey, many respondents lack a data-and-analytics strategy at their companies even when they recognize the need. Sixty-one percent of respondents who recognize that data and analytics have affected their core business practices say their companies either have not responded to these changes or have only taken ad hoc actions rather than developing a comprehensive, long-term strategy for analytics. Businesses that are not realizing the full potential value of data are leaving untapped opportunities on the table and are at real risk of being disrupted by companies that are driving forward with an analytics agenda. While the value of data is undeniable, achieving transformational-level success can be tricky, and navigating the journey of infrastructure, vendor solutions, processes, techniques and tools can be challenging. Organizations that are pursuing analytics often make the critical mistake of focusing on the technology rather than starting with the strategy and desired business outcomes, which can ultimately hinder their ability to monetize -- in other words, get value from -- their data.



Quote for the day:



"See opportunity in every disaster, and transform that negative situation into an education..." -- Ryan Holiday


Daily Tech Digest - December 03, 2018

Marriott data breach reactions
Marriott Hotels should have identified this breach through its cyber due diligence of Starwood when it acquired the company in 2016. As a result of buying a breach, it will face a number of challenges at board level around the levels of governance and diligence within the business. Had it performed a detailed compromise assessment as part of its due-diligence activity, the organisation’s board would have been informed of the breach and been able to make a decision based on risk, or put other warranties in place. Since the compromise started in 2014, the breach doesn’t fall under the remit of GDPR. However, the fallout would be incredibly severe under this regulation, and therefore any organisation looking to undergo an M&A deal now or in the future should learn from this example and ensure comprehensive cyber security and compromise assessments are carried out to inform its understanding of risk.


What's the best way to make the most of the cloud?

That ability to scale is critical to the organisation. Six years ago, the business had clear peaks in traffic -- the launch of its world-famous book of records every September and Guinness World Records Day in November. Today, GWR is less reliant on publishing and operates more like a digital consultancy and its traffic peaks are unpredictable. Howe gives an example. "On the first day we went live with the new AWS infrastructure, there was a press release for the largest unlimited wave surfed by a woman," he says. "It was huge news in the surfing community and within a few hours we'd received four times our normal daily web traffic. Yet we were able to meet that demand comfortably by just turning on the auto-scaling capability of the cloud." As well as scalability and flexibility, Howe says the cloud provides other benefits. "It allows us to be more dynamic as a team and to think more carefully about where we should focus our attention," he says. "It gives us better transparency in terms of costs, too."


How to buy SD-WAN technology: Key questions to consider when selecting a supplier

The first strategic choice is deciding what kind of partner you want to deploy and support your SD-WAN architecture. IT organizations can work directly with the leading SD-WAN technology providers and their channel partners, or purchase a managed SD-WAN service from a service provider such as AT&T, Verizon, CenturyLink, Comcast and many others. Most organizations will benefit from an experienced channel partner to integrate SD-WAN into their existing branch/WAN infrastructure, which may include routers, WAN optimization appliances, firewalls and other network security elements. Many organizations will want to outsource SD-WAN technology and related bandwidth decisions to a managed service provider. Organizations that plan to implement an internally developed (non-managed) SD-WAN solution need to examine several key issues for deployment. These include a review of their branch WAN/LAN architecture, WAN bandwidth requirements and providers, and, of course, selecting an SD-WAN technology.


Prepare for Takeoff: The Future of ERP

A truism dating back to the ancient Greeks notes that luck is when opportunity meets preparation. It's hard to argue that Plattner got lucky with his vision. Rather, it's a case of timing really being everything. After all, a complete ERP redesign should take roughly a decade. That timeline matters a great deal now in light of converging forces in the business world. On the one hand, global competition puts greater emphasis on operational efficiency. With margins tightening, companies need to optimize their processes, and especially, the customer experience. A real-time ERP can play a major role in accomplishing both of these objectives. Another major driver right now is regulation. There are numerous regulatory changes across several industries that are currently putting pressure on organizations to get much more visibility into issues like revenue recognition, privacy and process. In all of these scenarios, a real-time ERP can play a significant role in helping organizations not only stay compliant, but also reinvent critical business processes.


New forms of governance needed to safely and ethically unlock value of data


“Legally speaking, if we’re going to be setting up data trusts with massive amounts of data and serious risks in terms of data security and the implications on people’s privacy, questions about consent, we’re going to have to have data protection impact assessments,” she added. In terms of how decisions are made about data usage, the trusts could potentially help strike a balance between purely giving organisations control, which could encourage monopolistic behaviour and further entrench the power imbalance, and purely giving individuals control, which would require significant effort on their part to manage the vast amounts of data held on them. “I think the notion that an individual wants to literally manage large amounts of data about themselves is a strange sort of idea,” said Roger Taylor, chair of the Centre for Data Ethics and Innovation, which was set up by the DCMS in June 2018 to help create ethical frameworks for the use of emerging technologies.


Google Assistant and Smart Display get several new features for the holidays

The "Pretty Please" feature headlines Google's list. With it enabled, saying "Please" with your Assistant commands can now produce some unique responses from her, like "Thanks for asking so nicely." The idea pitched last summer is that you're teaching younger users to mind their manners when asking for things. And in case Google AI ends up taking over the world like something out of a sci-fi movie, you'll have banked some good faith that may save you from Martian slave pits and re-education camps. Pretty Please is enabled now in the app, and for Google's smart speakers and smart displays. There's no setting to enable or disable -- the Assistant just has additional responses when you ask nicely. Google's also added the long-awaited ability to create and manage lists with the Assistant, instead of having to use the Keep Notes app (Android, iOS) separately. The company says that it will be adding Keep Notes integration soon, as well as support for Any.do, Bring!, and Todoist.


From Warfare to Outsourced Software Development


Customers are not enemies and projects are not really born as conflicts, but in order to wage your “friendly” attack on your customer, you need to know enough about his territory and capacity. This foreknowledge is what we call Project Intelligence. There is a Reconnaissance mission needed here. Never take it as a spying mission; it’s just a matter of ethical, deep observation of things as they are naturally exposed to us by the customer. But who are the agents who will do it for you? Just look around you. Sales and presales are the spearhead in winning the battle for a new contract. Before the win takes place, they spend a good amount of time on the customer’s territory setting up connections, getting to know things and people, and striving to make it happen: contracting. In this way, they can be exposed to quite a good deal of information, not only on pure business grounds but also on the business politics of the new front, its weaknesses and strengths.


Cloud investments should be to boost agility, not cut costs

Since cloud implementation, they have reached several new groups they wouldn’t have been able to reach beforehand, enabling better two-way communication and capturing all consumer interaction data. From a developer’s point of view, cloud adoption has been instrumental, as they are able to create new functionality on the fly and deploy it easily and economically to their users. This scenario is repeated in other industries and enterprises, such as Cepsa, the second largest petroleum and chemical company in Spain. Cepsa had already integrated new technologies into its operations, but it still needed to reinvent and streamline processes, something that it was finally able to accomplish by using iOS mobile apps built on an open cloud platform. Now, these apps allow service station workers to anticipate their own needs and place new supply orders with a single click. Moreover, direct sales representatives also benefit because they can manage every customer interaction on their smartphones, speeding up order processing, approval and fulfilment.


AWS Aims to Speed and Simplify Robot Development

RoboMaker’s cloud extensions are important because they enable robots to do far more than they could using only local resources. The cloud extensions are written as ROS packages, so developers familiar with ROS can easily embed them in their applications. Zhu said Amazon currently has five integrated services: Amazon Lex and Amazon Polly, which enable developers to add natural-language conversation capabilities to their robots; the Amazon Kinesis data streaming service; the Amazon Rekognition service for facial recognition and object detection; and the Amazon CloudWatch service for near real-time streaming of telemetry and log data for monitoring individual robots and fleets. RoboMaker also includes a simulation environment, which is necessary because not all developers working on robotic applications have access to the target device.


How the cybercrime and cyberwar landscape is constantly changing

To the average person, it's pretty unlikely that a state-backed hacker is going to come after you unless you're a really high-value target; it's quite a rare kind of risk. Obviously if you are, I don't know, working in aerospace or biotech or robotics, one of those kinds of companies, then there's a reasonable chance that someone's going to try and hack your systems to steal your intellectual property or just cause trouble. In terms of the bigger risks, clearly down the line there's a lot of worry about cyber warfare -- that hackers could actually break into things like power systems or banks and cause chaos that way. That's clearly a huge risk, but the likelihood is very low. What's going to happen day to day is that you're more likely to run into a scammer or maybe get ransomware on your PC or something like that. Those are the everyday risks, which are incredibly annoying and a real problem if suddenly your PC is encrypted and you can't get to your family photos or the work you're doing.



Quote for the day:


"Leadership matters more in times of uncertainty." -- Wayde Goodall


Daily Tech Digest - December 02, 2018

How Technology is Changing the Lending Landscape
For lenders, adoption of tech-enabled risk modelling techniques, for instance, removes limitations associated with manual credit assessment and directly translates to speedier disbursement of credit to qualifying applicants. Beyond enhancing internal processes, technology is enabling lenders to target potential applicants based on their Internet search and social media behaviour patterns and helps expand sales pipelines and reach beyond target markets. ‘Pay-as-you-use’ tech offered by new age fintech vendors greatly helps level the playing field for newer players against well-heeled larger players. The arrival of tech-enabled alternative financing platforms like ‘Peer to Peer’ and ‘New to Credit’ lending, is also increasing choice for the borrower. In the pre-tech era, borrowing was like buying groceries from the only shop in your neighbourhood. You had to buy what was available. Lenders, banks and non-banking finance companies, offered a fixed set of loan options and borrowers had no other choice but to settle for whatever was available.


Getting your Fintech Ready for Investment with haysmacintyre

For many start-ups, an ‘angel’ investor will be the preferred choice. These investors can offer extensive experience and expertise, helping the business leaders through unfamiliar territory, whilst standing back from the day-to-day management of the business. Often referred to as “patient capital”, angel investors are generally less concerned with rapid returns, supporting the business throughout its growth. Alternatively, fintech companies may turn to venture capital (VC) investment. It is, however, worth considering that the VC investor will want to exit at some point down the line, many frequently departing within five years of Series A investment. It should also be kept in mind that they may want some control over the day-to-day operation of the business, and would possibly want a position on the board. Start-up businesses often receive investment that is particularly hands-off, where the investors pay little attention to day-to-day matters. As the scale of investment increases, businesses should prepare for this dynamic to change.


Redesigning the Office App Icons to Embrace a New World of Work


Today’s workforce includes five generations using Office on multiple platforms and devices and in environments spanning work, home, and on the go. We wanted a visual language that emotionally resonates across generations, works across platforms and devices, and echoes the kinetic nature of productivity today. Our design solution was to decouple the letter and the symbol in the icons, essentially creating two panels (one for the letter and one for the symbol) that we can pair or separate. This allows us to maintain familiarity while still emphasizing simplicity inside the app. Separating these into two panels also adds depth, which sparks opportunities in 3D contexts. Through this flexible system, we keep tradition alive while gently pushing the envelope. ... To reflect this in the icons, we removed a visual boundary: the traditional tool formatting. Whereas prior Office icons had a document outline for Microsoft Word and a spreadsheet outline for Excel, we now show lines of text for Word and individual cells for Excel.


3 Signs of a Good AI Model

The first step in understanding how to achieve XAI is to understand what a model is and how it works. Simply stated, a model is a set of transformations that convert raw data into information, most often by applying statistics and advanced mathematical constructs such as calculus and linear algebra. What makes AI models different from traditional data transformations is that the model is constructed by employing algorithms to expose patterns from historical data; those patterns form the basis for the mathematical transformation. Traditional data transformations are most often a set of directives and rules established and programmed by a developer to achieve a specific purpose. Because AI models learn from having more data, they can be regenerated periodically to sense and adjust to changes in the underlying behaviors associated with the transformation. One of the strengths of AI is that the process of creating a model can identify patterns that are not obvious and intuitive by looking at the data.
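The distinction drawn above between a programmed transformation and a learned one can be made concrete in a few lines of plain Python. Everything here is a hypothetical toy: the rule, the "history", and the fitting procedure (a simple midpoint between class means standing in for the statistics a real model would use):

```python
# Contrast of the two approaches described above: a hand-written rule vs.
# a "model" whose threshold is exposed as a pattern in historical data.

def rule_based(value):
    # Traditional transform: a developer hard-codes the cutoff.
    return "high" if value > 50 else "low"

def fit_threshold(history):
    """Learn a cutoff as the midpoint between the two class means."""
    lows = [v for v, label in history if label == "low"]
    highs = [v for v, label in history if label == "high"]
    return (sum(lows) / len(lows) + sum(highs) / len(highs)) / 2

history = [(10, "low"), (20, "low"), (80, "high"), (90, "high")]
threshold = fit_threshold(history)   # 50.0 here, but it tracks the data

def model(value):
    return "high" if value > threshold else "low"

assert model(70) == rule_based(70) == "high"
# Re-fitting on new history shifts the cutoff without touching any code:
# the pattern comes from data, not from a fixed programmed directive.
threshold = fit_threshold(history + [(40, "low"), (45, "low")])
```

The regeneration step at the end is the property the paragraph highlights: as behaviour in the underlying data drifts, the model is simply refit, whereas the rule-based transform would need a developer to edit it.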


The future of cash in Canada

The Bank of Canada staff economists considered all of this in a discussion paper released this fall called "Is a Cashless Society Problematic?" The paper cites the consistent decline of cash payments in Canada over decades. It also mentions an analysis by Forex Bonuses, which declared Canada the top country in the world for embracing cashless technology. A very close second was Sweden, a country where the government is now studying how going cashless could affect the nation. The findings referenced in the paper focused on indicators such as the number of credit cards per person and the volume of cashless transactions. To save drivers time and reduce traffic congestion, New York state switched to cashless toll booths at Grand Island, where millions of tourists travelling to Niagara Falls pass through every year. For users without a pass, the state mails the registered owner a bill. The complication is mailing bills to Canadian addresses attached to license plates. The state can't access that information, something it didn't fully consider when implementing the system.


Enterprises face these 3 challenges while adopting AI


Several barriers to AI adoption, ranging from analyzing disparate data to identifying the right AI use case to hiring the best talent, hold back companies from seizing AI opportunities. For quite some time now, there has been a lot of buzz around AI and its promise to disrupt industries altogether. From digital assistants to robotic process automation to self-driving cars, AI has delivered cool and innovative applications that were once only the subject of science fiction. Today, AI has reached a level of precision where it can understand human emotions too. The power of AI to make machines ‘smart’ and ‘intelligent’ has prompted many industries to invest in AI projects. The decision to leverage AI to aid digital transformation is pretty understandable. But companies should first analyze the potential barriers to AI adoption so that they can enjoy a successful AI implementation. While companies think of leveraging AI to transform their existing workflows, they should keep in mind these potential hurdles, and plan their journey with AI accordingly.


IoT Trends to Watch for in 2019

Throughout 2018, the staggering growth of digital assistants, such as Amazon’s Alexa and Google Home, showed that smart devices are here to stay. While the concept of smart toasters has been a long-running joke, analysts forecast strong growth among consumer-facing IoT devices of all shapes and sizes. A bit of smart technology can simplify our home lives; automated vacuum cleaners have long been popular, but adding in some smart technology can make them even more useful. Smart technology is already making home security systems far more capable, and bringing smart technology into the kitchen can make it easier to save time while preparing meals. Businesses are certainly looking into investing in smart technology, and smart desks and smart walls are expected to become far more common in 2019. If there’s any perceived benefit of adding IoT technology to a consumer device, someone is likely to offer it for sale, even if the value added is dubious. Edge devices have become staples of typical IoT installations, as they allow for more efficient operations and better responsiveness.


Azure Service Fabric Mesh: A Platform for Building Mission Critical Microservices


The Service Fabric Cluster provides you with a reliable and scalable cluster of VMs running the Service Fabric runtime, into which you deploy and manage your applications/services (containerized or non-containerized) via a highly available cluster endpoint. The Service Fabric runtime makes service placement decisions based on its integration with the underlying Azure infrastructure, making them reliable. When using Azure Service Fabric Clusters, you have administrator access not only to your cluster but also to the VMs that make up the cluster. You pick the VM SKUs to meet your needs, and you get to decide on the network security rules and the autoscale rules by which you want to scale the cluster. You can set up automatic upgrades of the Service Fabric runtime and the VM operating system. With this offering, you are only paying for the VM, storage and networking resources you use; the Service Fabric runtime is effectively free. It is a great fit for customers/ISVs who need full control of the infrastructure.


Adding Object Detection with TensorFlow to a Robotics Project


My robot uses the Robot Operating System (ROS). This is a de facto standard for robot programming, and in this article we will integrate TensorFlow into a ROS package. I'll try and keep the details of the ROS code to a minimum, but if you wish to know more, may I suggest you visit the Robot Operating System site and read my articles on Rodney. Rodney is already capable of moving his head and looking around and greeting family members that he recognises. To do this we make use of the OpenCV face detection and recognition calls. We will use TensorFlow in a similar manner to detect objects around the home, like for instance a family pet. Eventually the robot will be capable of navigating around the home looking for a particular family member to deliver a message to. Likewise, imagine you are running late returning home and wish to check on the family dog. With the use of a web interface you could instruct the robot to locate the dog and show you a video feed of what it's doing. Now in our house we like to say our dog is not spoilt, she is loved.


The Digital Twin Organization: Can Enterprise Architecture Help?

The Digital Twin of the Organization is a concept created by Gartner. Quite simply, it is predicated on using a digital representation of an organization (its business model, strategies etc.) to better plan and execute a business transformation initiative. The whole idea behind the digital twin concept, and the reason why it is so useful, is that it offers a virtual model that can be analyzed and tweaked more easily than the real thing. The new insights and efficiencies you uncover this way can in turn be used to improve the organization.  Model is the key word here. Models are massless, frictionless, virtually free, reusable, and – importantly – they are also the lifeblood of enterprise architecture. Thus, EA is by default positioned to play a key part in taking the Digital Twin of the Organization from concept to reality. We have been arguing the importance of a model-based approach to business change for quite some time on this blog, now it seems the future is starting to catch up. Let us have a more detailed look at how exactly EA helps, and offer some examples based on the BiZZdesign suite.



Quote for the day:


"To be able to lead others, a man must be willing to go forward alone." -- Harry Truman


Daily Tech Digest - December 01, 2018

Blockchain has been wildly mis-sold, but underneath it is a database with performance and scalability issues and a lot of baggage. Any claim made for blockchain could be made for databases, or simply publishing contractual or transactional data gathered in another form. Its adoption by non-technical advocates is faith-based, with vendors' and consultants' claims being taken at face value, as Eddie Hughes MP (Con, Walsall North) cheerfully confessed to the FT recently. "I'm just a Brummie bloke who kept hearing about blockchain, read a bit about it, and thought: this is interesting stuff. So I came up with this idea: blockchain for Bloxwich," said Hughes. As with every bubble, whether it's Tulip Mania or the Californian Gold Rush, most investors lose their shirts while a fortune is being made by associated services – the advisors and marketeers can bank their cash, even if there's no gold in the river.


Building Resilient Data Multiclouds


Resilience is risk mitigation that is engineered into all your IT assets. It's the confidence that your infrastructure won't fail you, especially in times of crisis. Resilience that's baked into the True Private Cloud ensures that businesses can weather those once-in-a-lifetime "black swan," "perfect storm," and other disruption scenarios that can put them and their stakeholders out of business permanently. To evolve toward this resilience architecture, enterprises must take steps to ensure that their migrations of data, analytics, and other IT infrastructure to cloud environments are comprehensively resilient. The path to the True Private Cloud requires a keen focus on building unshakeable resilience into distributed data assets. Management information systems and enterprise data warehouse systems have historically been viewed as "second-class citizens" among IT infrastructure platforms, and occasional failures of these systems were simply tolerated.


Global Financial Services Bullish On AI, The 'Disruptive Tech' Frontrunner


AI and its application through machine learning are increasingly being used to automate processes such as credit decision-making and customer interaction, as well as to help detect fraud, money laundering and even terrorist activity. Capital markets-focused organizations such as investment banks are the furthest down the road in the financial services industry in adopting new disruptive technologies, with a little over half (51%) saying that AI, ...  and just 17% among those in the private wealth industry. Stephanie Miller, Chief Executive Officer of Intertrust, commenting in the wake of the findings, said: “With the hype surrounding disruptive technology in the financial sector it is easy to lose sight of reality. The findings from this study suggest that while the industry is positive towards new technology such as AI, blockchain and robotics, only a minority of firms are currently putting it to use and the speed of travel remains cautious.”


Geospatial Data Brings Value Across Industries

Though it may seem like a highly technical concept, most people use some type of geospatial data system every day because such programs are used to route Uber drivers, assess credit risk and lending rates based on zip code, and determine insurance rates by identifying homes at risk of flooding, earthquakes, and other natural disasters. Even kids use geospatial data to play games like Pokemon Go. Geospatial information is everywhere and, in a world where everyone is attached to a smartphone, we’re constantly connected to it. Put simply, geospatial data just means that the information set is tied to zip codes, addresses, or coordinates, among other possibilities. It’s a map or an address book, reinterpreted for a digital ecosystem. Though there are plenty of groups building geospatial data sets, one of the factors that has most contributed to this new digital world is the availability of open data sets.
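To make "tied to coordinates" concrete, here is a small, purely illustrative sketch (all names and figures are invented for the example): records keyed by latitude/longitude, plus a radius query over them using the haversine great-circle distance.

```rust
// A record of ordinary business data that becomes "geospatial" simply by
// carrying a location alongside it.
struct Record {
    name: String,
    lat: f64,
    lon: f64,
}

/// Great-circle distance in kilometres between two (lat, lon) points,
/// using the haversine formula and a mean Earth radius of 6371 km.
fn distance_km(a: (f64, f64), b: (f64, f64)) -> f64 {
    let (lat1, lon1) = (a.0.to_radians(), a.1.to_radians());
    let (lat2, lon2) = (b.0.to_radians(), b.1.to_radians());
    let dlat = lat2 - lat1;
    let dlon = lon2 - lon1;
    let h = (dlat / 2.0).sin().powi(2)
        + lat1.cos() * lat2.cos() * (dlon / 2.0).sin().powi(2);
    2.0 * 6371.0 * h.sqrt().asin()
}

/// A typical spatial question: which records lie within `km` of `centre`?
fn within_radius(records: &[Record], centre: (f64, f64), km: f64) -> Vec<&Record> {
    records
        .iter()
        .filter(|r| distance_km((r.lat, r.lon), centre) <= km)
        .collect()
}
```

Queries like this, scaled up and indexed, are what power the routing, risk-scoring and flood-zone lookups described above.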


Built for realtime: Big data messaging with Apache Kafka

Apache Kafka's architecture is very simple, which can result in better performance and throughput in some systems. Every topic in Kafka is like a simple log file. When a producer publishes a message, the Kafka server appends it to the end of the log file for its given topic. The server also assigns an offset, a number used to permanently identify each message. As the number of messages grows, the value of each offset increases; for example, if the producer publishes three messages, the first one gets an offset of 0, the second an offset of 1, and the third an offset of 2. When the Kafka consumer first starts, it will send a pull request to the server, asking to retrieve any messages for a particular topic starting at offset 0. The server will check the log file for that topic and return the three new messages. The consumer will process the messages, then send a request for messages starting at offset 3, and so on.
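The log-and-offset mechanics above can be sketched as a toy in-memory model (illustrative only; this is not the Kafka client API, and a real broker partitions topics and persists the log to disk):

```rust
/// Toy model of one Kafka topic's append-only log.
struct TopicLog {
    messages: Vec<String>,
}

impl TopicLog {
    fn new() -> Self {
        TopicLog { messages: Vec::new() }
    }

    /// Producer side: the "server" appends the message to the end of the
    /// log and assigns it the next offset, starting from 0.
    fn publish(&mut self, msg: &str) -> usize {
        self.messages.push(msg.to_string());
        self.messages.len() - 1
    }

    /// Consumer side: a pull request for everything from `offset` onward,
    /// returning the messages plus the offset to ask for on the next poll.
    fn poll(&self, offset: usize) -> (Vec<String>, usize) {
        (self.messages[offset..].to_vec(), self.messages.len())
    }
}
```

A real consumer remembers the next offset (here returned by `poll`) and commits it back to the broker, so it can resume from where it left off after a restart.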


Expert Excuses for Not Writing Unit Tests

Many studies do show a correlation between LoC and the overall cost and length of development, and between LoC and the number of defects. So while it may not be a precise indication of progress, it is not a completely useless metric. The lower your LoC measurement is, the better off you are in terms of defect counts. For a tool to calculate this for you, try https://github.com/boyter/scc/ which will also give you a COCOMO estimation. Be sure to run it over projects that have tests and see how much additional cost the tests add. Do this internally if you can with projects that have tests, and point out that the tests add some percentage of cost. If you can cherry-pick projects to make this look worse, so much the better. If someone challenges that the project with tests was more successful, point out using the same model that the project cost more. More money spent means more quality to most people. If you mix metaphors and ideas here you can also impress and confuse people to the point they will be afraid to challenge you further. Be sure to point out that adding tests means writing more code, which takes longer, which also impacts cost. Also be sure to point out that while tests are being written, nobody will be fixing bugs. This is usually enough of an argument to stop everything dead in its tracks.


Confused by AI Hype and Fear? You’re Not Alone

Although AI leaves the door open for other paths to machine intelligence, most advances towards this goal so far have been made using machine-learning algorithms. These have some key characteristics that separate them from other algorithms, and that will define the field unless another route to AI is discovered in the near future. Machine learning is primarily concerned with algorithms that can make connections between various annotated data and their output. Crucially, they are also able to learn independently from new, varied data, thereby improving their models without the need for human intervention. This approach lends itself to many of AI’s defining use cases, such as computer vision and machine translation. It’s debatable whether any AI applications to date haven’t derived from machine learning in some way. Almost all current chatbots have been built with machine learning, but there is another approach that some data scientists are considering. Rule-based models are founded on linguistic systems that are developed by experts to imitate the ways humans structure their speech.
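The rule-based alternative mentioned at the end can be sketched in a few lines (the patterns and replies below are invented for illustration): hand-written rules map input to canned responses, with no learning involved.

```rust
/// A minimal rule-based responder: each rule is a hand-crafted pattern,
/// checked in order, with a fixed fallback when nothing matches.
fn rule_based_reply(input: &str) -> &'static str {
    const RULES: &[(&str, &str)] = &[
        ("hello", "Hello! How can I help you?"),
        ("price", "Our pricing page lists all current plans."),
        ("bye", "Goodbye!"),
    ];
    let lower = input.to_lowercase();
    for &(pattern, reply) in RULES {
        if lower.contains(pattern) {
            return reply;
        }
    }
    "Sorry, I didn't understand that."
}
```

Unlike a machine-learning chatbot, improving this system means an expert adding or reordering rules by hand; nothing in it changes in response to new data.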


Man-in-the-disk attacks: A cheat sheet

Cue a recent discovery by researchers at the software research firm Check Point: a class of attacks they dubbed "man-in-the-disk" (MITD), which exploit a weakness in Android's handling of external storage to inject malicious code. The exploit allowing MITD attacks has serious repercussions for Android users because it exists at a level that's integral to Android's design. If man-in-the-disk sounds similar to man-in-the-middle (MITM) attacks, it's because there are many ways in which the attacks are similar. Both involve intercepting, and often modifying, data for nefarious purposes--it's simply the scale that distinguishes the two attacks. Check Point's researchers found a number of apps--including some from major distributors such as Google--that were vulnerable to MITD attacks. Researchers also managed to build their own apps that took advantage of the exploit.


Want A Bigger Bang From AI? Embed It Into Your Apps


A key element of application-centric AI: context. Say a sales executive wants to call on important customers in several cities. AI can review the accounts and, based on past history, predict which customers might increase business after a sales call, then suggest an itinerary that would maximize ROI from the trip. A common factor in all of these scenarios is that integrating AI and machine learning into applications lets the app take some type of action automatically. Automation allows many tasks to be performed without human intervention, and without human error, says Swan. AI systems can execute relatively straightforward actions, such as booking a rental car for that sales trip. They can also tackle harder tasks that normally require not only time but also some level of expertise, such as optimizing business workflows, reviewing financials for anomalies, or finding expense report violations. Often there’s still a human review, but that review can often be done faster, and more accurately, with the AI’s assistance in laying all the groundwork, presenting recommendations, and providing the background, documentation, and reasoning behind those recommendations.


Why open standards are the key to truly smart cities


In collaboration with several partners, including The Open Group, academic institutions and industry players, bIoTope is running a series of cross-domain smart city pilot projects which will provide proofs-of-concept for a wide range of applications, including smart metering, smart lighting, weather monitoring, and the management of shared electric vehicles. These projects will reveal the benefits that can be realised through the use of IoT technology, such as greater interoperability between smart city systems. They will also deliver a much-needed framework for security, privacy and trust to facilitate responsible access to, and ownership of, data on the IoT. Ultimately, bIoTope will deploy smart city pilots in Brussels, Lyon, Helsinki, Melbourne and Saint Petersburg. It is hoped that these pilot schemes will showcase the sustainable business ecosystems that will generate value for end users, solution providers, municipalities and other stakeholders.



Quote for the day:


"Risk more than others think is safe. Dream more than others think is practical." -- Howard Schultz


Daily Tech Digest - November 30, 2018

Man-in-the-middle attacks: A cheat sheet

The concept behind a man-in-the-middle attack is simple: Intercept traffic coming from one computer and send it to the original recipient without them knowing someone has read, and potentially altered, their traffic. MITM attacks give their perpetrator the ability to do things like insert their own cryptocurrency wallet to steal funds, redirect a browser to a malicious website, or passively steal information to be used in later cybercrimes. Any time a third party intercepts internet traffic, it can be called a MITM attack, and without proper authentication it's incredibly easy for an attacker to do. Public Wi-Fi networks, for example, are a common source of MITM attacks because neither the router nor a connected computer verifies its identity. In the case of a public Wi-Fi attack, an attacker would need to be nearby and on the same network, or alternatively have placed a computer on the network capable of sniffing out traffic.


Technical Debt Will Kill Your Agile Dreams

Bad engineering decisions are in a different category from ones that were made tactically, with full knowledge that the short-term priority was worth it. When it's clear that such a decision was, in fact, a tactical decision, it is much easier to convince people that refactoring needs to happen and the debt has to be paid off. Unfortunately, when the term is used as a polite way of saying bad engineering, it's unlikely there is any repayment strategy in place, and it is even harder to create one: first, you need to convince people there is some bad engineering; then you need to convince people it is causing problems; then you have to come up with a better approach and convince various stakeholders of that too. Finally, you need to convince them that the investment needed to refactor is worthwhile. It is like trying to win five matches in a row away from home when you don't even have your best players.


3 Keys to a Successful “Pre-Mortem”


The concept of a pre-mortem has been around for years, but only recently have we seen it pick up speed in the engineering community. This is an activity which is run before starting on a big stage of a project, but after doing a product mapping and prioritization activity. Rather than exploring what went wrong after the fact and what to do differently in the future, the goal of a pre-mortem is to identify potential pitfalls and then apply preventative measures. It’s a great idea, but for those new to the concept, it’s easy to overlook some important aspects of the process. Talking about what might go wrong is scary. It acknowledges that many things are out of our control, and that we might mess up the things which are within our control. Talking about what might go wrong, and how to adapt to it, acknowledges the possibility of failure. As this is a rare thing in industry, doing it outside of a structured activity can seem like trying to weasel your way out of work.



12 top web application firewalls compared

AWS WAF by itself does not offer the same sort of features you could expect from other solutions on this list, but coupled with other AWS solutions, AWS WAF becomes as flexible as any competing solution. Existing AWS customers will see the most value in selecting AWS WAF due to the architectural benefits of staying with a single vendor. ... Each architecture comes with its own set of pros and cons, varying from the simplicity of the SaaS option to the fine-grained control over configuration and deployment with the appliance-based offerings. Barracuda’s various configurations offer very similar functionality, though there are some differences here and there. Server cloaking limits the amount of intel a potential attacker can gain on your configuration by hiding server banners, errors, identifying HTTP headers, return codes, and debug information. Server cloaking is available on all versions of the web application firewall, as is DDoS protection.


Creating a Turing Machine in Rust


A Turing machine is a mathematical model of computation that reads and writes symbols on a tape based on a table of rules. Each Turing machine can be defined by a list of states and a list of transitions. Starting from a start state (s0), the Turing machine works its way through the states until it reaches a final state (sf). If no transition leads to the final state, the Turing machine will run ‘forever’ and eventually run into errors. A transition is defined by the current state, the symbol read at the current position on the tape, the next state and the next symbol that must be written to the tape. Additionally, it contains a direction to determine whether the head of the tape should move to the left, to the right or not at all. To visualize this process, let’s take a look at a very simple Turing machine that increments the value from the initial tape by one. ... While this is a very simple Turing machine, we can use the same model to create machines of any complexity. With that knowledge, we are now ready to lay out the basic structure of our project.
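The model just described can be sketched in Rust as a transition table keyed on (state, read symbol). The names and representation here are my own, not necessarily the article's actual code, and the incrementing machine is sketched in unary rather than binary:

```rust
use std::collections::HashMap;

#[derive(Clone, Copy)]
enum Direction {
    Left,
    Right,
    Stay,
}

struct Machine {
    // (current state, read symbol) -> (next state, symbol to write, head movement)
    transitions: HashMap<(u32, char), (u32, char, Direction)>,
    final_state: u32,
}

impl Machine {
    /// Run from state 0 until the final state is reached; returns the tape.
    fn run(&self, tape: &str) -> String {
        let mut tape: Vec<char> = tape.chars().collect();
        let mut state = 0u32;
        let mut head = 0usize;
        while state != self.final_state {
            let read = tape[head];
            let &(next, write, dir) = self
                .transitions
                .get(&(state, read))
                .expect("no transition defined: the machine would run forever");
            tape[head] = write;
            state = next;
            match dir {
                Direction::Left => head -= 1,
                Direction::Right => head += 1,
                Direction::Stay => {}
            }
        }
        tape.into_iter().collect()
    }
}

/// An "increment by one" machine over a unary tape: state 0 walks right
/// over the 1s; on the blank `_` it writes one more 1 and halts in state 1.
fn increment_machine() -> Machine {
    let mut t = HashMap::new();
    t.insert((0, '1'), (0, '1', Direction::Right));
    t.insert((0, '_'), (1, '1', Direction::Stay));
    Machine { transitions: t, final_state: 1 }
}
```

Running the incrementer on the tape `111_` walks past the three 1s, overwrites the blank, and halts with `1111` on the tape.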


Tech support scammers are using this new trick to bypass security software

Symantec describes this kind of attack technique as 'living off the land', whereby attackers exploit legitimate features in systems to hide malicious activity. In and of itself, obfuscation isn't malicious, but it can be used for malicious purposes. "There are many open source tools to obfuscate code as developers don't want their code to be seen by the users of their software. Similar is the case with encryption algorithms like AES. Such algorithms have wide usage and implementations in the field of data security," said Siddhesh Chandrayan, threat analysis engineer at Symantec. "Both these mechanisms, by themselves, may not generate an alarm as they are legitimate tools. However, as outlined in the blog, scammers are now using these mechanisms to show fake alerts to the victims. Thus, scammers are 'living off the land' by using 'inherently non-malicious' technology in a malicious way," he added.


Standout predictions for the cloud – a CTO guide

“Many businesses have previously shied away from true multi-cloud deployments by favouring public infrastructures due to the perceived expense of private platforms, rooted in the required expertise necessary to run them. However, recent technological developments that enable businesses to take a highly-automated approach have shown that this is now an outdated view of cloud infrastructure. When it comes to transforming with cloud technologies, multi-cloud is proving itself to be the correct endgame for businesses in all industries.” ... “Enterprises are eliminating all the “state” from their endpoint devices, where any changes are stored only temporarily on the device and are quickly and efficiently on-ramped to the organisation’s cloud. “One key benefit, aside from IT efficiency gains, is that it represents an elimination of the “dark data” that was previously stored in employees’ laptops or desktops. Suddenly, all this “dark” data is right at your fingertips – stored in the cloud – as a searchable, analysable and shareable repository.”



Typemock vs. Google Mock: A Closer Look

Writing tests for C++ can be complicated, especially when you are responsible for maintaining legacy code or working with third-party APIs. Fortunately, the C++ marketplace is always expanding, and you have several testing frameworks to choose from. Which one is the best? In this post, we'll consider Typemock vs. Google Mock. We'll use Typemock's Isolator++ and Google Mock, the C++ framework that is bundled with Google Test, to write a test function for a small project. As we implement the tests, we'll examine the difference in how the frameworks approach the same problem. ... Fowler defines an order object that interacts with a warehouse and mail service to fill orders and notify clients. He illustrates different approaches for mocking the mail service and warehouse so the order can be tested. This GitHub project contains Fowler's classes implemented in C++ with tests written in Google Mock. Let's use those classes as a starting point, with some small changes, for our comparison.


Distributed Caching in ASP.NET Core with Couchbase

Caching can help improve the performance of an ASP.NET Core application. Distributed caching is helpful when working with an ASP.NET application that’s deployed to a server farm or scalable cloud environment. Microsoft documentation contains examples of doing this with SQL Server or Redis, but in this post, I’ll show you an alternative. Couchbase Server is a distributed database with a memory-first (or optionally memory-only) storage architecture that makes it ideal for caching. Unlike Redis, it has a suite of richer capabilities that you can use later on as your use cases and your product expand. But for this blog post, I’m going to focus on its caching capabilities and integration with ASP.NET Core. You can follow along with all the code samples on GitHub. ... No matter which tool you use as a distributed cache (Couchbase, Redis, or SQL Server), ASP.NET Core provides a consistent interface for any caching technology you wish to use.


7 reasons why artificial intelligence needs people


As AI projects roll out over the next few years, we will need to rethink the definition of the “work” that people will do. In the post-AI era, the future of work will become one of the largest agenda items for policy makers, corporate executives and social economists. Despite the strong and inherently negative narrative around the impact on jobs, the bulk of the impact from the automation of work through AI will be a “displacement” of work, not a “replacement” of work. It’s easy to see how the abacus-to-calculator-to-Excel phenomenon created completely new work around financial planning and reporting, and enterprise performance management. Similarly, AI will end up accelerating the future of work, and the resulting displacement of jobs will be part of a transition already in place, not an entirely new discussion. As some work gets automated, other jobs will get created, in particular ones that require creativity, compassion and generalized thinking.



Quote for the day:


"A single question can be more influential than a thousand statements." -- Bo Bennett