Daily Tech Digest - July 25, 2018

Are Initial Coin Offerings leaking money?
The bottom line is that ICOs are being constructed with serious holes in them. Worse still, as the numbers from EY show, cyber criminals are taking advantage. Companies running ICOs are drawing huge sums of money in a very narrow window of time. If something goes wrong once the ICO is live, there is little room for manoeuvring and precious little legal recourse that can realistically be taken. These are perfect conditions for cyber criminals to exploit: the financial motivation is high, and they have been drawn to ICOs like sharks to churn in the water. The consequence of an attack? There are two parties that could be affected: the ICO organisers and the investors. Just one vulnerability is enough for attackers to steal investors’ money and do irreparable damage to the corporate reputation of the ICO organiser. The need to patch these holes is apparent, but organisations are working on short time frames and might not realise where they are most vulnerable. So what are the main points of weakness?



Rolls-Royce Is Building Cockroach-Like Robots to Fix Plane Engines


Rolls-Royce believes these tiny insect-inspired robots will save engineers time by serving as their eyes and hands within the tight confines of an airplane’s engine. According to a report by The Next Web, the company plans to mount a camera on each bot to allow engineers to see what’s going on inside an engine without having to take it apart. Rolls-Royce thinks it could even train its cockroach-like robots to complete repairs. “They could go off scuttling around reaching all different parts of the combustion chamber,” Rolls-Royce technology specialist James Cell said at the airshow, according to CNBC. “If we did it conventionally it would take us five hours; with these little robots, who knows, it might take five minutes.” Rolls-Royce has already created prototypes of the little bot with the help of robotics experts from Harvard University and the University of Nottingham. But they are still too large for the company’s intended use. The goal is to scale the roach-like robots down to stand about half an inch tall and weigh just a few ounces, which a Rolls-Royce representative told TNW should be possible within the next couple of years.


Hyper-converged systems integrate storage, computing and networking into a single system -- a box or pod -- in order to reduce data center complexity and ease deployment challenges associated with traditional data center architectures. A hyper-converged system comprises a hypervisor, software-defined storage and internal networking, all of which are managed as a single entity. Multiple pods can be networked together to create pools of shared compute and storage. While tight integration is desirable, these systems have many of the same networking challenges as other data center deployments, including requirements for scalability, automation, security and management of traffic flows. Additionally, they need to link to other resources inside the data center, at remote data centers and in the cloud. Software-defined networking architecture can ease some of the scaling, automation, security and connectivity challenges of hyper-converged system deployments.


Big Tech is Throwing Money and Talent at Robots for the Home

Whether or not the robots catch on with consumers right away is almost beside the point because they’ll give these deep-pocketed companies bragging rights and a leg up in the race to build truly useful automatons. “Robots are the next big thing,” said Gene Munster, co-founder of Loup Ventures, who expects the U.S. market for home robots to quadruple to more than $4 billion by 2025. “You know it will be a big deal because the companies with the biggest balance sheets are entering the game.” Many companies have attempted to build domestic robots before. Nolan Bushnell, a co-founder of Atari, introduced the 3-foot-tall, snowman-shaped Topo Robot back in 1983. Though it could be programmed to move around by an Apple II computer, it did little else and sold poorly. Subsequent efforts to produce useful robotic assistants in the U.S., Japan and China have performed only marginally better. IRobot Corp.’s Roomba is the most successful, having sold more than 20 million units since 2002, but it only does one thing: vacuum.



“Enterprise Architecture As A Service” - How To Reach For The Stars


Educate those implementing your value chain in best open practices. To deliver EA As A Service, one would do well to ensure services are delivered through best practices that are open, because this enables an organization to train easily, hire selectively, and produce consistently. Of course, one might ask about differentiation – the secret sauce for differentiation will be in your proven ability to deliver fast and on target! Apply the best-in-class tools proven to improve production capability. Similar to the above, deciding upon and utilizing a consistent set of best-in-class tools helps ensure that deliverables are consistent among clients and enables reuse, which can improve the speed and quality of delivery. Tools that support the best open practices add even more. Collaborate with partners to evolve the best open practices. Keeping in mind that differentiation comes in how well you deliver EA As A Service, collaboration on the best open practices provides an avenue to improve the best practices based on real experiences, improves market perception, and helps keep the bar raised for the industry.



Micropsia Malware


Controlled by Micropsia operators, the malware is able to register for USB volume insertion events in order to detect newly connected USB flash drives. This functionality is detailed in an old blog post. Once an event is triggered, Micropsia executes a RAR tool to recursively archive files based on a predefined list of file extensions ... Most of the malware capabilities mentioned above have outputs written to the file system which are later uploaded to the C2 server. Each module writes its own output in a different format, but surprisingly in a non-compressed and non-encrypted fashion. Micropsia’s developers decided to solve these issues by implementing an archiver component that executes the WinRAR tool. The malware first looks for an already installed WinRAR tool on the victim’s machine, searching in specific locations. In the event a WinRAR tool is not found, Micropsia drops the RAR tool found in its Windows Portable Executable (PE) resource section to the file system.
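To make the lookup-then-drop behaviour concrete, here is a minimal sketch of the kind of check an analyst might reproduce; the candidate paths are illustrative assumptions, not the actual search list used by Micropsia:

```python
import os

# Illustrative locations where a WinRAR binary is commonly installed
# (assumed paths -- the malware's real search list is not reproduced here).
CANDIDATE_PATHS = [
    r"C:\Program Files\WinRAR\Rar.exe",
    r"C:\Program Files (x86)\WinRAR\Rar.exe",
]

def find_rar_tool():
    """Return the first existing RAR binary, or None if none is installed."""
    for path in CANDIDATE_PATHS:
        if os.path.isfile(path):
            return path
    return None  # Micropsia would fall back to dropping its bundled RAR tool
```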


FinTech’s road to financial wellness


It’s one thing to build up a pot of money (saving), but it’s also vital to make that money work hard for you (investing). Investment platforms like Moneybox and Nutmeg are giving everyday people the ability to make their money go further. Robo-advice in particular is making it considerably easier for consumers to invest their money in a way that matches their circumstances and attitude to risk. A key benefit of these start-ups is that they often have low minimum investment limits, which has enabled younger generations and those with small savings pots to invest. ... A recent report found the insurance sector lags only behind the utilities sector when it comes to disappointing customers with a poor online customer experience. These bad experiences are putting consumers off dealing with insurance and insurers, meaning those consumers often aren’t financially protected. InsurTech companies like Lemonade, however, are using behavioural economics and new technology to create aligned incentives between the insurer and the customer.


Securing Our Interconnected Infrastructure

While it's encouraging that the House is leaning forward on industrial cybersecurity and committed to authorizing and equipping the Department of Homeland Security to protect our critical infrastructure, this remains largely a private sector problem. After all, over 80% of America's critical infrastructure is privately owned, and the owners and operators of these assets are best positioned to address their risks. In doing so, one of the questions companies are asking themselves is how to reconcile the risks and rewards of the interconnected world. Should we simply retreat into technological isolationism and eschew the benefits of connectivity in the interest of security, or is there a better way to manage the risk? The former view is attracting a growing chorus of support, especially among security researchers. The latest call comes from Andy Bochman of the Department of Energy's Idaho National Labs. Bochman argued this past May in Harvard Business Review that the best way to address the cyber-risk to critical infrastructure is "to reduce, if not eliminate, the dependency of critical functions on digital technologies and their connections to the Internet."


The race to build the best blockchain


Things move incredibly fast in the blockchain world. Ethereum is three years old. Projects like Cardano and EOS, sometimes called "blockchain 2.0" projects, are already considered to be giants in the space. They have a combined token market cap of roughly $11.8 billion despite barely being operational. Cardano, which favors a slow and steady approach, with every iteration of the software peer reviewed by scientists, is promising, but it hasn't fully launched its smart contract platform yet. EOS, an incredibly well-funded startup that launched in June, is another huge contender. However, EOS has a complicated governance process, which caused a fair amount of trouble right after the launch, together with a slew of freshly discovered bugs. With an estimated $4 billion in its pocket, EOS has the means to do big things, but it will take some time to see whether it can live up to the promise. But there's already a new breed of blockchain startups coming. They've been working, often in the shadows, to develop new concepts and technologies that may make the promise of a fast, decentralized app platform a reality.


Serverless vs. containers: What's best for event-driven apps?


Event processing is very different from typical transaction processing. An event is a signal that something is happening, and it often requires only a simple response rather than complex edits and updates. Transactions are also fairly predictable, since they come from specific sources in modest numbers. Events, however, can originate anywhere, and the frequency of events can range from nothing at all to tens of thousands per second. These important differences between transactions and events launched the serverless trend and also precipitated the strategy called functional programming. Functional programming is pretty simple. A function -- or lambda, as it is often called -- is a software component whose outputs are based only on its inputs. If Y is a function of X, then Y varies only as X does. For practical reasons, functions don't internally store data that could change their outputs. Therefore, any copy of a function can process the same input and produce the same output. This facilitates highly resilient and scalable applications.
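A minimal sketch of that property in Python (the handler and event shape are invented for illustration, not tied to any particular serverless platform):

```python
def handler(event: dict) -> dict:
    """A pure 'lambda': the output depends only on the input event.

    No internal state is read or written, so any copy of this function,
    on any host, produces the same output for the same input.
    """
    reading = event["temperature_c"]
    return {"reading": reading, "alert": reading > 100.0}

# Two independent "instances" always agree on the same input:
assert handler({"temperature_c": 120.0}) == handler({"temperature_c": 120.0})
```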



Quote for the day:


"Rarely have I seen a situation where doing less than the other guy is a good strategy." -- Jimmy Spithill


Daily Tech Digest - July 24, 2018

Rapid7 penetration tests reveal multitude of software flaws, network misconfigurations

People are simply too predictable when it comes to creating passwords, and that’s even if an organization enforces password length and complexity standards. For example, “Summer2018!” meets the objectives of a password that is required to have at least one uppercase letter, one lowercase letter, one number, and one special character. But Rapid7 noted that it is one of the worst passwords a person can choose. Seasonal passwords came in as the third most common type of password. ... What do organizations most care about protecting? Despite the almost-daily data breach announcements, Rapid7 found that organizations are more concerned with protecting their own sensitive data, such as internal communications and financial metrics, than protecting the sensitive data of their customers or employees. As for organizations’ top five priorities for protecting information, sensitive internal data was at the top with 21 percent, PII second at 20 percent, authentication credentials third at 14 percent, payment card data fourth at 7.8 percent, and bank account data fifth at 6.5 percent.
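Rapid7's point is easy to demonstrate. The sketch below shows a typical complexity rule (an assumed policy, not Rapid7's actual check) happily accepting a guessable seasonal password:

```python
import re

def meets_policy(pw: str) -> bool:
    """A common policy: 8+ characters, upper, lower, digit, special."""
    return (len(pw) >= 8
            and re.search(r"[A-Z]", pw) is not None
            and re.search(r"[a-z]", pw) is not None
            and re.search(r"[0-9]", pw) is not None
            and re.search(r"[^A-Za-z0-9]", pw) is not None)

print(meets_policy("Summer2018!"))  # True -- compliant, yet trivially guessable
```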



Three AI And Machine Learning Predictions For 2019


The U.S. Army is currently using machine learning to predict when combat vehicles need repair. Think about it: there are millions of pieces of equipment that our Army uses each and every day. To keep track of the data involved, they are recruiting the help of an AI assistant. For the first implementation, a few dozen armored infantry transports will receive sensors inside the vehicles’ engines. These sensors will record temperature and RPM and will transmit them to the software. Machine learning capabilities will look for patterns in the data that match engine failures in similar vehicles. What if your car did this? AAA might become obsolete if your car could tell you that the transmission is about to crap out on you. If the Army is using the technology, I'm sure it won't be long till we see it in the civilian world. Automotive isn't the only industry seeing potential new uses for this tech; healthcare is about to see some changes too. As if Google wasn’t already on the AI map, it has begun to predict the likelihood of a patient’s death using machine learning – with a staggering 95% accuracy.
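As a rough illustration of the approach (a generic sketch with made-up readings, not the Army's actual system):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Made-up sensor readings: [engine temperature (C), RPM]
X = np.array([[90, 2100], [95, 2300], [130, 3500],
              [125, 3400], [88, 2000], [135, 3600]])
y = np.array([0, 0, 1, 1, 0, 1])  # 1 = reading preceded an engine failure

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(model.predict([[128, 3450]]))  # flag this vehicle for preventive repair
```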



NVMe is a protocol for accessing high-speed storage media that’s designed to reduce latency and increase system and application performance. It's optimized for all-flash storage systems and is aimed at enterprise workloads that require low latency and top performance, such as real-time data analytics and high-performance relational databases. Storage vendors have been re-tooling their systems to support the faster interconnect protocol, and IBM is no exception. A key change in the FlashSystem 9100 is the use of small form factor NVMe drives. IBM redesigned its FlashCore technology to fit into a standard 2.5-inch SSD form factor with NVMe interfaces – a move that reduced the physical size of the drives by more than half. That redesign made an impression on Owen Morley, director of infrastructure at online dating platform Plenty Of Fish. Morley is among a group of users of IBM's all-flash storage who came together at an event in Mexico City to share their thoughts on the new 9100 system and the potential for NVMe-accelerated storage in their own enterprises.



Edge computing will be vital for even simple IoT devices

The evolution of wearables required each generation to monitor and collate a greater number of measurements (raw data). Developers found optimal ways of doing this by processing raw data locally (on the edge of the application, using the Bluetooth chips’ increasingly powerful onboard processors) and then forwarding to a smartphone app and the cloud (for data sharing and tracking) only the essential information (desired data). The technology enabled continuous (low-latency) monitoring, and the modest Bluetooth wireless throughput was sufficient to update apps and cloud servers with the key tracking information without requiring the extended on-air duration that would otherwise be needed to stream raw data. Sending only the key information also minimized the impact on the user’s cellphone data allowance (data cost).

Things go wrong, hackers never quit

Because users didn’t always carry their smartphones, wearables had to operate autonomously when not connected. Resiliency was built into the systems. They didn’t depend on a continuous network or internet connection for successful operation (redundancy).
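The raw-data-in, desired-data-out pattern described above might look like this on the device (a sketch; the sample burst and summary fields are assumptions):

```python
from statistics import mean

def summarize_on_edge(raw_samples: list) -> dict:
    """Reduce a burst of raw sensor samples to the small 'desired data'
    message actually worth sending over Bluetooth to the app or cloud."""
    return {
        "avg": round(mean(raw_samples), 1),
        "max": max(raw_samples),
        "n": len(raw_samples),
    }

# e.g. 252 raw heart-rate samples shrink to one compact summary message
payload = summarize_on_edge([72.0, 74.5, 71.8] * 84)
```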


Nation-State Spear Phishing Attacks Remain Alive and Well

The trouble with phishing is that it relies on social engineering - meaning it's designed to trick users - and it can potentially be used to compromise any online account. Unfortunately, we humans are both easy to trick - at least some of the time - as well as fallible. And attackers can pummel would-be victims with phishing attacks until one succeeds. The scale of the phishing challenge is reflected by the number of video interviews touching on phishing that I recently conducted at the London Infosecurity Europe conference. Experts described everything from the increasingly targeted nature of phishing attacks and the importance of never forgetting the human factor and training users, to using technology to extract data from emails and attachments and tracking malicious domains to better block phishing campaigns. But as this patchwork of practices, procedures and technology demonstrates, there's no single fix for the phishing problem. Furthermore, with more of our business and personal lives now living in the cloud, the impact of falling victim to a phishing attack continues to increase.



Privacy pros gaining control of technology decision-making over IT

“This global survey is critical in our efforts to better understand how privacy professionals are addressing compliance challenges and the technologies that are being deployed now and in the near future,” said Chris Babel, CEO of TrustArc. “Though security budgets remain larger, we’re seeing a marked shift in privacy teams’ influence over technology purchasing decisions. This trend confirms what we’re seeing among our customers – that they have a growing need for technology solutions to help them manage privacy compliance at scale on a global basis.” The EU GDPR and other global and domestic legal reforms, combined with technological advancements, have made the task of operationalizing privacy and data protection vastly more complicated. Businesses now must account for how data is entering the organization, how it is being used, what permissions are attached to it and who has the responsibility for managing it. To address these challenges, the demand for privacy technology continues to grow rapidly.


Measuring Tech Performance: You’re Probably Doing It Wrong


First, velocity is a relative and team-dependent measure, not an absolute one. Teams usually have significantly different contexts which make comparing velocities inappropriate. (Seriously, don't do this.) Second, when velocity is used as a productivity measure, teams are very likely to game it: they inflate their estimates and focus on completing as many stories as possible at the expense of collaboration with other teams (which might decrease their velocity and increase the other team's velocity, making them look bad). Not only does this destroy the utility of velocity for its intended purpose, it also inhibits collaboration between teams. Velocity as a productivity metric violates our guidelines by focusing on local measures rather than global measures. This is particularly obvious in the second critique above: by (understandably) making choices to optimize their own velocity, teams will often not collaborate with other teams. This often results in scenarios where subpar solutions are available to the organization because there isn't a focus on global measures.


How to spot bad data, and know the limitations when it's good

A 2016 survey of CEOs found 84 percent of them felt concerned about the quality of data they used while making decisions. And they have valid reasons for feeling wary — bad data could cause financial repercussions if business leaders put too much trust in material that’s ultimately lacking. It’s also crucial to consider the wasted time from bad data. When professionals engage in data-driven marketing, they may be relying on content filled with non-human influences such as bots or malware. If that happens, they could get false perceptions of customers’ journeys at websites or the factors that cause them to linger on certain pages versus others. There are reputational risks, too. If a company releases public research that later gets proven inaccurate, it’ll be difficult for that entity to encourage trust in future material. When business leaders blindly trust data — especially when making decisions — they inevitably set the stage for problems. Staying aware of the characteristics of bad data discussed here is an excellent first step in being proactive.


Law firms failing to meet their clients’ digital expectations, according to study

Martin Flick, CEO of Olive Communications, said: “Today’s busy, always on and mobile first consumer wants to buy goods and services, and communicate with sellers whenever, wherever, and however they choose.” “Increasingly this is through digital interaction. When it comes to their lawyer or solicitor, they want to engage in the same way, without the frustration of having to wait days for paper documents to arrive in the post or for an email to come through with the answer to a question that could be easily resolved with an instant message or automated response.” “Consumers want more control over their legal affairs with sometimes, little or no human intervention, and with the speed, efficiency, and security that multiple channel web-based communications offer.” The study found that a significant portion of law firms are embracing new technology internally, for example, 69% are using IM and chat to communicate with each other. However, few of these firms are extending the use of technology externally to enhance the client experience.


Backup best practices: A NAS is not enough

The idea of 3-2-1 is to have three copies of every file, two of which are on different physical devices, and one of which is located off-site. Our guy didn't have that. He counted entirely on one NAS for all his backups. He has an offsite backup, but it hadn't been updated. The "off" part of my strategy is to have at least one full backup air-gapped from the Internet. I do this for my stuff by keeping one backup server shut down, except for a once-a-week quick incremental backup nibble ... The point of this article, though, is to remind you of the 3-2-1-off-and-away strategy and to not be dumb. A single NAS as your backup strategy is not enough. As a rule, I have two NAS boxes running all the time. One is my hot, live working environment. Another is an offline backup. In my case, I was fortunate that the ioSafe folks sent me their flood-and-fire-proof ioSafe 1515+, so my backup NAS isn't just a second NAS, it's an armored bomb-proof bunker of a backup NAS. At some point in the future, I'll take you through my whole storage architecture.



Quote for the day:


"You may not control all the events that happen to you, but you can decide not to be reduced by them." -- Maya Angelou


Daily Tech Digest - July 23, 2018

Most of AI’s Business Uses Will Be in Two Areas


The business areas that traditionally provide the most value to companies tend to be the areas where AI can have the biggest impact. In retail organizations, for example, marketing and sales has often provided significant value. Our research shows that using AI on customer data to personalize promotions can lead to a 1-2% increase in incremental sales for brick-and-mortar retailers alone. In advanced manufacturing, by contrast, operations often drive the most value. Here, AI can enable forecasting based on underlying causal drivers of demand rather than prior outcomes, improving forecasting accuracy by 10-20%. This translates into a potential 5% reduction in inventory costs and revenue increases of 2-3%. While applications of AI cover a full range of functional areas, it is in fact in these two cross-cutting ones—supply-chain management/manufacturing and marketing and sales—where we believe AI can have the biggest impact, at least for now, in several industries. Combined, we estimate that these use cases make up more than two-thirds of the entire AI opportunity.



How SD-WAN Will Make The Cloud Much Much Bigger

The need to be connected to the mother ship is what brings the Cloud into its meaningful existence, because we live and work at the edges of the Cloud. SD-WAN is not just a market but also a platform, one that will eventually evolve into user-defined WAN (UD-WAN). To clarify, the term applies to enterprise users and not consumers. And the purpose of SD-WAN is to connect and fully integrate the very edges of the enterprise – be it corporate headquarters, branch/remote offices or the mobile millions. In other words, us, the users. But if we look at the concept of the cloud, it is pretty clear that it is referenced in an abstract form. After all, what is this cloud thing? Some physical space in a nondescript windowless warehouse? Without its tentacles, the cloud is nothing more than a collection of computers, storage and cooling systems created by geeks – and for what purpose? It is those very tentacles in the form of wide-area networks (WAN) that give the Cloud its purpose. And given the explosive adoption of cloud-based applications (Box, Dropbox, Salesforce, SAP, Slack, etc.), cloud computing is not a fad; it is here to stay. However, that is just the beginning.


The value of superior UX? Priceless, but awfully hard to measure

The problem, Cooper continues, is that managers and executives outside of the bubble remain skeptical about investing any more than they have to in UX -- to them, it's a dark art. So, they ask: "What is the ROI of UX?" Asking about ROI, of course, is a manager's way of expressing doubts. "They aren't seeking enlightenment," Cooper says. ... In UX design, he continues, "ROI is often about eliminating poor design." Some industry specialists have attempted to put a monetary value on superior UX design. A recent report from CareerFoundry estimates that UX design work delivers a 100-fold return on investment, without even counting the soft benefits. Every $1 invested in UX translates to returns of at least $100, the report's authors illustrate -- mainly through e-commerce and customer-facing interactions. Add to this the softer, but just as important, ancillary benefits: "fewer support calls, increased customer satisfaction, reduced development waste, and lower risk of developing the wrong idea."


Why tech matters – the challenge for everyone in the UK tech community


If there is a magic recipe for digital innovation, then the UK surely has all the ingredients. We have created and attracted some of the world’s best and most diverse digital talent. We have world-leading businesses, universities and powerful ecosystems that enable expertise to spill over from one part of the economy to another. In almost every sector, I can point to world leaders on the cutting edge of digital transformation. Above all, we have ambition and we have each other. What sets us apart from any other country is that in the UK technology community, we stand on the shoulders of each other. But to really thrive, three things are important. We must stay focused on making tech work for people and our economy. We must not underestimate our international competitors. And, perhaps most importantly, we must accept the enormous responsibility that comes with developing powerful technology. We do have great people in this sector – but we simply don’t have enough of them. And we don’t have the depth of skills and talent that the economy needs as a whole. This, surely, is our biggest challenge.


Why Artificial Intelligence Is Not a Silver Bullet for Cybersecurity

While AI is likely to work quite well over a strictly controlled network, the reality is much more colorful and much less controlled. AI's Four Horsemen of the Apocalypse are the proliferation of shadow IT, bring-your-own-device programs, software-as-a-service systems, and, as always, employees. Regardless of how much big data you have for your AI, you need to tame all four of these simultaneously — a difficult or near-impossible task. There will always be a situation where an employee catches up on Gmail-based company email from a personal laptop over an unsecured Wi-Fi network and boom! There goes your sensitive data without AI even getting the chance to know about it. In the end, your own application might be protected by AI that prevents you from misusing it, but how do you secure it for the end user who might be using a device that you weren't even aware of? Or, how do you introduce AI to a cloud-based system that offers only smartphone apps and no corporate access control, not to mention real-time logs? There's simply no way for a company to successfully employ machine learning in this type of situation.


Unsecured server exposes 157 GB of highly sensitive data from Tesla, Toyota and more

The unsecured trade secrets and corporate documents had been exposed via the file transfer protocol rsync. UpGuard wrote, “The rsync server was not restricted by IP or user, and the data set was downloadable to any rsync client that connected to the rsync port. The sheer amount of sensitive data and the number of affected businesses illustrate how third- and fourth-party supply chain cyber risk can affect even the largest companies. The automation and digitization of manufacturing has transformed the industry, but it has also created a new area of concern for industries, and one that must be taken seriously for organizations to thrive in a healthy digital ecosystem.” Not only could anyone connect to Level One’s rsync server, but it was also “publicly writable, meaning that someone could potentially have altered the documents there, for example replacing bank account numbers in direct deposit instructions, or embedding malware.” The exposed rsync server was discovered on July 1. Attempts to contact Level One started on July 5, but contact wasn’t established until July 9. The exposure was closed within a day, by July 10.


Organizations Need IT Experts Who Know Basic LAN/WAN Switching and Routing

Responsiveness, security, and reliability are the new hallmarks of networking. Automation, analytics, IoT, policy-based network management, programmability, and virtualization are enabling these changes. The technologies, and the ways they're being applied, are new. So, IT and networking professionals need new skills to make them work for businesses. To appeal to hiring managers, boost their careers, and bring greater value to employers, there are fundamental skills that IT and networking professionals need. At a very fundamental level, it's critical that IT experts know the basics of LAN and WAN switching and routing. These skills will help network engineers configure, verify, troubleshoot, and secure today's networks. In addition, the evolution of the network creates a growing need for IT professionals who can implement and manage software-centric networks. This involves using APIs, controllers, policies, and virtualization. These technologies and tools allow for greater automation, network intelligence, and agility.


The Engineer’s guide to the future


If AR is hyped, AI is basically the buzzword of the century. Lots of people aren’t really sure what it means, but they know it’s important and that their business needs it. The first thing to know is that modern-day Artificial Intelligence doesn’t actually mean a computer being intelligent — it’s basically a catch-all term for computer programs that can “learn” in order to improve their operational efficiency or their success. Even at that, lots of applications that say they use AI actually don’t. A chatbot that has a big decision tree in the background isn’t AI, it’s just a big decision tree. If you ask “What is Ragnarok?” and get back the answer “It is simultaneously a great action movie and the ruin of a good character” — it’s probably not artificial intelligence, just quite wise. However, there is plenty of amazing work being done with proper AI and Machine Learning, for a whole heap of use cases. We don’t need a crystal ball to say that knowing about AI will be beneficial for a future engineering career. Similar to Apple and Google releasing tools to “democratise” Augmented Reality development, each year there are more tools available to enable developers to build AI solutions.


In the wake of GDPR, college IT security programs need to evolve

While U.S. universities who offer information security programs typically cover a range of compliance concepts related to U.S. regulations, such as the Health Insurance Portability and Accountability Act (HIPAA) or Sarbanes-Oxley (SOX), the GDPR is something of a game changer because it is not a regulation enacted by a U.S. agency, yet it requires compliance on the part of U.S. entities. The GDPR is only the first of several proposed global regulations governing data privacy. Before 2015, data exchanges between the U.S. and the EU were governed by the Safe Harbor program which allowed the personal data of EU citizens to be exchanged with U.S. providers as long as both sides of the transaction complied loosely with the EU Data Protection Directive. The directive wasn’t as tightly defined as the GDPR and lacked teeth in the form of significant fines or penalties. As a result, up to this point in time, U.S. businesses have not had to unduly concern themselves with regulations enacted outside U.S. borders. GDPR demands a change in that mindset.


Can businesses use blockchain to solve the problem of data management?

Since the nodes are distributed and operate peer-to-peer, the possibility of bottleneck formation is nonexistent. One of the most important features of blockchain systems, however, is immutability: once an entry is appended to the database, it cannot be removed. Using blockchain for databases seems like a logical step forward. There’s definitely an emerging movement seeking to lay the foundations for a decentralised architecture across industries. With blockchain, a marketplace akin to AirBnB or Uber can materialise for storing data – nodes on the network can be incentivised to replicate and retain information using a blockchain protocol’s inbuilt payment layer. This concept can be taken a step further with the use of sharding and swarming. Sharding offers a greater degree of privacy whereby, instead of sending a file to other nodes, you distribute fragments of said file. In this way, the owner can be sure that those in possession of their data cannot access it, as they will only hold a small (and unreadable) piece – much like torrenting.
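A toy version of the encrypt-then-shard idea, using the `cryptography` package (the shard count and what happens to the shards afterwards are illustrative assumptions):

```python
from cryptography.fernet import Fernet

def shard_file(data: bytes, n_shards: int):
    """Encrypt, then split: any single shard is unreadable on its own."""
    key = Fernet.generate_key()             # stays with the data owner
    ciphertext = Fernet(key).encrypt(data)
    size = -(-len(ciphertext) // n_shards)  # ceiling division
    shards = [ciphertext[i:i + size] for i in range(0, len(ciphertext), size)]
    return key, shards                      # shards go to different nodes

key, shards = shard_file(b"sensitive business records", n_shards=4)
```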



Quote for the day:


"Authentic leaders are not afraid to make mistakes, but they fix them faster than they make them." -- George Bernard Shaw


Daily Tech Digest - July 22, 2018

By reducing manual intervention, automated processes can minimise mistakes and human error – but there is still the chance that something can go wrong. Designers of automated processes need to ensure that the appropriate quality outcomes are being measured and assessed against a given specification. Importantly, this must happen throughout the entire process. Let’s think about the car production line again. The cost of finding out that something went wrong at the start of the production process after the car has been built is significant. Instead, process designers will want to identify errors quickly and allow the process to make the necessary changes to ensure a quality product is delivered. A significant quantity of data is generated through automated processes, but the quantity of data does not compensate for the quality of the data. In order to deliver a quality product at the end of an automated process, a quality data management process is critical. But what is bad data? And, if everything is being automated anyway, why should we care?


Python has brought computer programming to a vast new audience


Not all Pythonistas are so ambitious, though. Zach Sims, Codecademy’s boss, believes many visitors to his website are attempting to acquire skills that could help them in what are conventionally seen as “non-technical” jobs. Marketers, for instance, can use the language to build statistical models that measure the effectiveness of campaigns. College lecturers can check whether they are distributing grades properly. For professions that have long relied on trawling through spreadsheets, Python is especially valuable. Citigroup, an American bank, has introduced a crash course in Python for its trainee analysts. A jobs website, eFinancialCareers, reports a near-fourfold increase in listings mentioning Python between the first quarters of 2015 and 2018. The thirst for these skills is not without risk. Cesar Brea, a partner at Bain & Company, a consultancy, warns that the scariest thing in his trade is “someone who has learned a tool but doesn’t know what is going on under the hood”. Without proper oversight, a novice playing with AI libraries could reach dodgy conclusions.
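The marketing use case mentioned above can be a few lines of pandas; the column names and figures here are invented for illustration:

```python
import pandas as pd

# Hypothetical campaign data: one row per customer contact
df = pd.DataFrame({
    "campaign":  ["email", "email", "social", "social", "search"],
    "spend":     [10.0, 12.0, 25.0, 20.0, 18.0],
    "converted": [1, 0, 1, 1, 0],
})

# Conversion rate and total spend by channel
summary = df.groupby("campaign").agg(
    conversion_rate=("converted", "mean"),
    total_spend=("spend", "sum"),
)
print(summary)
```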


Top 10 Data Science Use Cases in Insurance


Customers are always eager for personalized services that match their needs and lifestyle perfectly well. The insurance industry is not an exception in this case. The insurers face the challenge of assuring digital communication with their customers to meet these demands. Highly personalized and relevant insurance experiences are assured with the help of artificial intelligence and advanced analytics extracting insights from a vast amount of demographic data, preferences, interactions, behavior, attitude, lifestyle details, interests, hobbies, etc. Consumers tend to look for personalized offers, policies, loyalty programs, recommendations, and options. The platforms collect all the possible data to define the major customers’ requirements. After that, a hypothesis on what will or won’t work is made. ... Modern technologies have brought the promotion of products and services to a qualitatively new level. Different customers tend to have specific expectations of the insurance business. Insurance marketing applies various techniques to increase the number of customers and to assure targeted marketing strategies. In this regard, customer segmentation proves to be a key method.
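Segmentation of this kind is often a straightforward clustering exercise; a minimal sketch with invented policyholder features:

```python
import numpy as np
from sklearn.cluster import KMeans

# Invented features: [age, annual premium, claims filed]
customers = np.array([[25, 400, 0], [62, 900, 2], [31, 450, 1],
                      [58, 850, 3], [29, 420, 0], [65, 950, 2]])

segments = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(customers)
print(segments)  # e.g. [0 1 0 1 0 1] -- two candidate marketing segments
```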


The Evolution Of Data


Traditionally, a platform was used to address an enterprise process workflow — human resources (HR), finance, manufacturing, etc. They are what we categorize as enterprise resource planning (ERP), customer relationship management (CRM), human capital management (HCM), functional setup manager (FSM), information technology operations (ITOps), etc. The data generated by these workflows was then analyzed using analytics or business intelligence applications to make further modifications to workflow. These workflow applications were customized as the data warranted any changes in the workflow. ... The workflow actions will be passed on to the traditional applications or directly to the people or system that will perform the actions. These new systems of intelligence will emerge and will force existing workflow applications to change to be end-user targeted. We are already seeing a trend where AI platforms are slowly becoming a playground for new intelligent applications. More importantly, because open source intelligent platforms in this area are as rich as the enterprise platforms, we are also noticing new generations of applications.


6 trends that are changing the face of UX

A pattern library acts as a centralised hub for all components of the user interface. Effective pattern libraries provide pattern descriptions, annotations and contextual information. They also showcase the code and pattern variations, and have the ability to add real data into the pattern structure. Once a design system is up and running, it’s only the first step in the journey. It needs to be living. Nathan Curtis, a co-founder of UX firm EightShapes, says: “A design system isn’t a project. It’s a product, serving products.” Like any good product, a design system needs maintenance and improvements to succeed. Both Google and Salesforce have teams dedicated to improving their design systems. The goal is a workflow where changes to the design system update the documentation and the code. The benefits realised by a thoughtful, unified design system outweigh the effort involved in establishing one. There is a consistency across the entire user experience. Engineers and designers share a common language, and systems are more sustainable. Designers can spend their time solving harder problems and improving the actual user experience.


What’s so special about 5G and IoT?

If we think about our current needs for IoT, what we care about are three things: price, coverage, and lower power consumption. But 5G is focused on increasing bandwidth, and while increased data transfer and speeds are nice, they are not entirely necessary for IoT products. The GSMA suggests 5G could offer 1,000x bandwidth per unit area. However, as it states in its own report, bandwidth per unit area is not dependent upon 5G, but on more devices connecting with higher bandwidths for longer durations. While it is great that 5G aims to improve this service, the rollout of LTE has already had a significant effect on bandwidth consumption. We should be excited about continued incremental improvements on Cat-M1 and NB-IoT as we get even lower cost and lower power solutions for our IoT applications. Unlike LTE, 5G lacks a solid definition, which means cellular providers could eventually label a slightly-faster-than-LTE connection as 5G. And truly, the only thing that is certain about 5G is we won't know what it can and cannot do until it arrives.


Microsoft's Linux love-in continues as PowerShell Core comes to Ubuntu Snap Store

Evidence of that newfound affection has been evident throughout 2018: with Ubuntu 18.04 being made available in the Microsoft Store, Windows File Explorer gaining the ability to launch a Linux Shell and a new option to install Windows Subsystem for Linux (WSL) distros from the command line. That's without mentioning Microsoft's release of the Linux-based Azure Sphere operating system. Now Microsoft has released its command-line shell and scripting language PowerShell Core for the Ubuntu Snap Store, as part of PowerShell Core's release as a snap package. Snap packages are containerized applications that can be installed on many Linux distributions, which Joey Aiello, PM for PowerShell at Microsoft, says has several advantages. "Snap packages carry all of their own dependencies, so you don't need to worry about the specific versions of shared libraries installed on your machine," he said, adding updates to Snaps happen automatically, and are "safe to run" as they don't interact with other applications or system files without your permission.


Why Design Thinking Should Also Serve As A Leadership Philosophy

The key here, from a leadership standpoint, is simply to drop the ego. Sweep aside titles and preconceptions about where audience insight should come from. Instead of defaulting to traditional techniques for collecting customer insight, seek it out wherever you can. Find the people who are best equipped to provide an insider's look at your customers' preferences and dislikes, whether those people are sitting in a focus group or across from you on the subway, so you can be sure you'll be giving your customers exactly what they want. Adopting a human-centric mindset can help you turn even the most fragmented experiences into seamless interactions between customer and brand. It's an investment in the customer journey that can build long-term loyalty and trust. Often, dissecting the user experience also reveals new product markets, audience segments and customer service platforms that can lead to future growth. When you consider what's at the heart of your business problem and break down the barriers between your company and your customers, it quickly becomes clear that design thinking can alter your leadership approach for the better.


Managing Engineering Complexity: Are You Ready?

So, here is the complexity loop we are in: customers demanding more capabilities leads to more complexity in IoT systems, constantly feeding data into the development processes, leading to new security and safety standards requirements, new use cases, and the need to adapt fast to changes that companies cannot always predict. These actions lead to the demand for even more complex IoT systems. And, with these new changes, new customer demands arise and the loop continues perpetually. Let’s zoom in for a second and see what that means for one of the most exciting industries today – automotive engineering, i.e., how we build a car. What characterizes the OEM leaders today is the desire for speed in product development and a capability of overcoming the complexity of connecting requirements, design, development, validation, and deployment within their engineering process and throughout their supply chain. And how do they do that?


The Role of Randomization to Address Confounding Variables in Machine Learning

Machine learning practitioners are typically interested in the skill of a predictive model and less concerned with the statistical correctness or interpretability of the model. As such, confounding variables are an important topic when it comes to data selection and preparation, but less important than they may be when developing descriptive statistical models. Nevertheless, confounding variables are critically important in applied machine learning. The evaluation of a machine learning model is an experiment with independent and dependent variables. As such, it is subject to confounding variables. What may be surprising is that you already know this and that the gold-standard practices in applied machine learning address this. ... Randomization is a simple tool in experimental design that allows the confounding variables to have their effect across a sample. It shifts the experiment from looking at an individual case to a collection of observations, where statistical tools are used to interpret the finding.
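In applied machine learning, this is exactly what shuffled train/test splits and cross-validation do: randomizing which observations land in which fold keeps any confounding variable from systematically favoring one side of the evaluation. A minimal sketch (synthetic data for illustration):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import KFold, cross_val_score

X, y = make_classification(n_samples=500, random_state=0)

# shuffle=True randomizes fold assignment, spreading confounders across folds
cv = KFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=cv)
print(scores.mean())
```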



Quote for the day:

"Leaders must be good listeners. It_s rule number one, and it_s the most powerful thing they can do to build trusted relationships." -- Lee Ellis

Daily Tech Digest - July 21, 2018


If we are looking for traffic spikes to detect an attack, how do we know it’s actually an attack and not simply a busy time for the business? Maybe our business is experiencing the result of a successful marketing campaign that has pushed more buyers to our website. On the flip side, maybe the hacker is using a low-and-slow attack on a backend server that takes out that service but never creates a large enough traffic spike to detect an issue. The answer is that traditional (rate-based) detection methods will not offer you certainty in dealing with next-generation DDoS issues. When you rate-limit traffic, you are blocking some good traffic from reaching its destination, and you are allowing some bad traffic into your network. Yes, you stay up, but your users may still experience some discomfort. Traditional devices are not able to discriminate bad from good, so anything over a certain threshold gets dropped. These devices lack specificity. Why would I not keep all of my good traffic and simply drop the bad?




Artificial intelligence has great potential in the realm of economics and business. It not only relieves workers from having to do repetitive or even dangerous tasks, it is also much faster in analyzing data volumes, making decisions based on this, and completing tasks. What’s more, robots will further automate production, which will open many new doors. For example, countries such as Germany will become a more attractive production location, thus increasing its competitiveness. There will no longer be any economic reasons for outsourcing production to low-wage countries. Whole new business areas will emerge as a result of AI joining up with connected products, processes and machines. AI is developing more and more into a disruptive core technology. It will revolutionize our working lives and current software applications! Just like humans, machines are also capable of making mistakes. As long as human health, life and death are not at stake or people are not being assessed, mistakes are acceptable. Using a percentage tolerance level, we humans will define probabilities which allow us to decide if a computation is correct.



Machines Teaching Each Other Could Be the Biggest Exponential Trend

Intelligent systems, like those powered by the latest round of machine learning software, aren’t just getting smarter: they’re getting smarter faster. Understanding the rate at which these systems develop can be a particularly challenging part of navigating technological change. Ray Kurzweil has written extensively on the gaps in human understanding between what he calls the “intuitive linear” view of technological change and the “exponential” rate of change now taking place. Almost two decades after writing the influential essay on what he calls “The Law of Accelerating Returns”—a theory of evolutionary change concerned with the speed at which systems improve over time—connected devices are now sharing knowledge between themselves, escalating the speed at which they improve. “I think that this is perhaps the biggest exponential trend in AI,” said Hod Lipson, professor of mechanical engineering and data science at Columbia University, in a recent interview.


Artificial Intelligence: The Potential For Better Corporate Governance

For banks in emerging markets with strong ties to US investments, investors and regulators are interested in these ratings. "Corruption and the erosion of trust in many of these countries are among the biggest challenges banks face doing business there," said Ms Haddad. "Risk is evolving, and credit risk only gives you part of the picture. Risk associated with counterparty illicit finance and conduct risk is equally – if not more – important for companies operating in complex markets. For most responsible boards we talk to, it's a top three issue," said Mr Jones, Sigma CEO. "What is unique about the Sigma Ratings team is that they saw a growing problem from inside the corridors of government and international development finance, and found an innovative way to marry their unique knowledge with cutting-edge technology," said Gareth Jones, Managing Partner at FinTech Collective. "We are excited to partner with Sigma Ratings as it changes how financial institutions around the world consume and apply risk information to conduct business," he added, describing the launch as "a game changer in today's environment where illicit finance is directly related to issues of economic development and national security."


What the Incident Responders Saw

Close to 60% of attacks involve lateral movement, or where the attacker travels from its initial victim machine to other machines in a targeted organization. PowerShell is one of the most popular tools for moving about the victim's network: 100% of IR pros say they've seen the Microsoft Windows automation and configuration management tool employed by attackers, and 84% see Windows Management Interface (WMI) as a key tool weaponized by attackers. This so-called "living off the land" approach of running legitimate tools to remain under the radar is classic behavior of persistent hacker teams such as nation-states. Some 54% of IR pros say legit operating system applications like these are being abused by attackers. In addition, 16% have spotted attackers running Dropbox to assist in their movements. "The uptick of WMI is concerning," notes Kellermann, as well as the use of process-hollowing and unsigned digital certificates. "It speaks to the level of sophistication [being used] to colonize that infrastructure."


Chaos is needed to keep us smart with Machine Learning


The “relative noise reduction” paradigm of AI is reducing the outliers that you see in your interactions with the technology. However, the noise that you don’t see is somehow narrowing your frontiers. As a data scientist, I am happy to see my algorithm achieve high accuracy, to predict and recommend well, and to self-correct. But these algorithms learn from what they see, exactly or partially bearing resemblance to history. They do not and cannot propose things my brain would propose in an exploratory process without any proof of obvious interest from history. And that brings us to a “side effect” of AI: the unconscious trimming of creativity. I have an interest in astronomy-related documentaries, and Google, Facebook, Instagram, YouTube, LinkedIn and even Pinterest seem to be aware of this. However, somehow, a chain reaction has been triggered that takes me to a gazillion resources around astronomy. How are we allowing the users of AI-powered technology to broaden their horizons by showing them something absolutely out of the blue once in a while? How are we ensuring that the power of human nature and the ability to learn is embedded in the recommendations we design?


Hyper-Localization Is Key To FinTech Expansion

“Some of the major investments we’re doing is localizing our services and opening offices worldwide to cater our value prop to the local businesses. A Chinese exporter’s needs are different than the needs of exporters in other areas of Asia, for example,” she says. The company has partnered with Chinese giants WeChat Pay and AliPay to try to increase acceptance rates in Western countries through Chinese mobile wallets. Under this partnership, Chinese consumers who are traveling abroad and would like to purchase a good in store or online can pay using their Chinese mobile wallet, and it will be accepted in the international store they are trying to buy from. The company also made waves in 2016 with its Rakuten partnership. Payoneer’s first major client was Getty Images back in 2007, which used its services to pay photographers around the globe. As its early clients have grown into major companies, Payoneer has grown up with them. “We’ve constantly delivered more services to address their needs. We also became more licensed and more regulated, which gave us more credibility for a small business back then,” says Levy.


How Artificial Intelligence will Transform IT Operations and DevOps


While Artificial Intelligence (AI) was merely a buzzword a few decades ago, it is now being commonly applied across different industries for a diverse range of purposes. Combining big data, AI, and human domain knowledge, technologists and scientists have become able to create astounding breakthroughs and opportunities, which used to be possible only in science fiction novels and movies. As IT operations become agile and dynamic, they are also getting immensely complex. The human mind is no longer capable of keeping up with the velocity, volume, and variety of Big Data streaming through daily operations, making AI a powerful and essential tool for optimizing analysis and decision-making. AI helps fill the gap between humans and Big Data, giving them the operational intelligence and speed needed to significantly ease the burden of troubleshooting and real-time decision-making. ... Wouldn’t it be easier to identify that single log entry putting cracks in the environment and crashing your applications if you just knew what kind of error you were looking for when filtering your log data?


The future of banking


The technology that ties decentralized and open banking together is the blockchain. With the blockchain accelerating the potential to realize the benefits of these approaches to banking, the industry is touting the blockchain as the key to a future where banking is more accessible. Open and decentralized banking is enabling companies like HSBC to work with startups like R3 to put the blockchain to the test. Projects like WeTrust, Kiva, and OPTIX are working on blockchain-based solutions to address specific issues in money management, lending, and personal finance. ... The goal is to make existing online services more accessible through blockchain technology. For the unbanked, this means being able to get a loan, send remittance, or cover medical expenses just with their smartphone or walking over to their local neighborhood store. This is “universal”: decentralizing these services to make them within reach for anyone who needs them. Imagine a world where it doesn’t matter if you’re from Nigeria or Bangladesh, from the city or a small town.


Digital transformation in logistics: Delmar begins tech overhaul


The Rackspace agreement represents a shift in IT strategy for the transportation and logistics firm. "We have always taken the approach that we have to own everything," said Ron McIntyre, CTO at Delmar. But the task of managing hardware, software and data globally -- Montreal-based Delmar has operations in North America, Asia and Latin America -- became difficult. And McIntyre said the company's existing infrastructure isn't scalable enough to react quickly to business events such as an acquisition or a new market venture. The Rackspace managed services deal is part of a modernization journey in which Delmar aims to position itself to become "much more competitive and efficient within our marketplace" in the next few years, McIntyre said. In that marketplace, established providers such as Delmar face challenges from startups and emerging technologies -- hence the need for digital transformation in logistics. "There are startups around the Uber model in terms of matching shippers with carriers," McIntyre said. "That is where the push is coming from."



Quote for the day:


"Don't waste your time with explanations: people only hear what they want to hear." -- Paulo Coelho


Daily Tech Digest - July 20, 2018

As an individual, I can demand to receive my personal data from a supplier, and I can demand that the supplier then deletes all the personal information they hold on me (subject to legal constraints – e.g. companies have a right to keep prior billing information, since it’s a statutory part of the accounting record). This is far from being an agreed-upon full “legal ownership right” to the information, but (as the famous saying goes) possession is nine-tenths of the law. As any economist can tell you, ownership rights have a profound impact on how markets are structured – so this might just be the start of a truly fundamental change to the entire industry. Revenge. What can you do today if you have an awful customer experience with a company? You can complain directly to the company, or to the world in general on social media. But you now have another potent weapon: you can demand that the company gives you all your personal data, and then exercise your “right to be forgotten”. Given the complex nature of computer systems in most organizations, this is currently likely to be a very manual and expensive process for the companies involved.


How to control state for so-called stateless microservices


Front-end state control is suitable if the transactions are handled by a web or mobile GUI/app, and that front end controls the sequence of steps being taken. Sometimes, the front-end process can make truly stateless requests to microservices. When it doesn't, the front end can provide the state as part of the data it sends to the microservice, and the microservice can then adjust its processing based on that state. This approach doesn't add any complexity or processing delay to the app's design. Back-end state control is the more complex approach to take with stateless microservices, from both developmental and operational perspectives. With back-end state control, the microservice maintains a database of state information, and it accesses that database when it has to process a message. If the microservice supports numerous transactions, a problem arises because it must determine which back-end database record corresponds to the current message. Sometimes, a transaction ID, timestamp or other unique identifier is provided for logging and can be used for state control as well.
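
To make the two options concrete, here is a minimal Python sketch of both; the message fields are invented for the example, and the in-memory dict stands in for a real state store such as Redis or a database:

```python
# Back-end state control: the microservice holds no state between calls;
# each message carries a transaction ID used to look progress up in a
# shared store (a dict here, standing in for Redis, DynamoDB, etc.).
state_store = {}

def handle_message_backend(message: dict) -> dict:
    tx_id = message["transaction_id"]            # unique ID from the caller
    state = state_store.get(tx_id, {"step": 0})  # recover prior progress
    state["step"] += 1                           # advance the transaction
    state_store[tx_id] = state                   # persist before replying
    return {"transaction_id": tx_id, "step": state["step"]}

# Front-end state control inverts this: the caller sends the state along
# with each request, so the service needs no store at all.
def handle_message_frontend(message: dict) -> dict:
    state = message.get("state", {"step": 0})
    state["step"] += 1
    return {"state": state}  # the caller keeps this for the next request
```

The trade-off the article describes is visible here: the back-end variant must solve the record-correlation problem (hence the transaction ID), while the front-end variant pushes that bookkeeping onto the caller.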


Switzerland seeks to regain cryptocurrency crown


Swiss banks are urging the authorities to give them more clarity on the rules that apply to cryptocurrency projects before providing services to the market, and at least two important players have withdrawn for now. Zuercher Kantonalbank (ZKB), the fourth largest Swiss bank and one of the few big banks in the world to welcome issuers of cryptocurrencies, has closed the accounts of more than 20 companies in the last year, industry sources told Reuters. A spokesman for ZKB declined to comment on any former or existing client relationships, but said the bank does not do business with any cryptocurrency groups. Another large Swiss bank kicked out crypto project Smart Valor at around the same time, said a person familiar with the project. The source declined to name the bank. Only a handful of Switzerland’s 250 banks ever allowed companies to deposit the cash equivalent of cryptocurrencies raised in ICOs. At least two still do, Reuters has established. But the involvement of a large bank like ZKB helped to establish Switzerland as an early cryptocurrency hub.


Tracking Disinformation by Reading Metadata


When we read metadata that’s been exploited or gamed on social media platforms as “data craft,” we can decode the signals and noise found in automated disinformation campaigns. Data craftwork not only gives us insight into the emerging techniques of manipulators; it is also a way of understanding the power structures of platforms themselves, a means of apprehending the currents and flows of the personalization algorithms that underwrite the classification mechanisms now structuring our digital lives. But before we can understand how metadata categories are harnessed and hacked, it’s necessary to have a fuller picture of what platform metadata is, how it is encoded and decoded, and how it is created and collected for use by a range of actors, from technologists and providers to individual users, governments and media manipulators. Currently, there is a range of known manipulation tactics for gaming engagement data. Social media professionals are known to inflate engagement by increasing likes, views, follower counts, and comments for profit.
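
As a purely illustrative example of reading such metadata, a crude heuristic might compare an account’s engagement to its follower count, since purchased likes and followers tend to skew that ratio; every field name and threshold below is an assumption for the sketch, not a documented platform signal:

```python
def engagement_ratio_flag(account: dict, low=0.001, high=0.5) -> bool:
    """Flag accounts whose like/comment volume is wildly out of line
    with their follower count, in either direction. Toy heuristic only:
    real manipulation detection combines many metadata signals."""
    followers = max(account.get("followers", 0), 1)
    engagement = account.get("likes", 0) + account.get("comments", 0)
    ratio = engagement / followers
    return ratio < low or ratio > high  # suspiciously dead or inflated
```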


Justice Department unveils strategy to fight election meddling, cybercrime

In the encryption section, DOJ notes that it cannot rely solely on purchasing workarounds like Cellebrite or GrayKey. “Expanding the government’s exploitation of vulnerabilities for law enforcement purposes will likely require significantly higher expenditures — and in the end it may not be a scalable solution,” the report warns. “All vulnerabilities have a limited lifespan and may have a limited scope of applicability.” Another problem relevant to election security is that the Computer Fraud and Abuse Act only empowers DOJ to prosecute people who hack internet-connected devices. “In many conceivable situations, electronic voting machines will not meet those criteria, as they are typically kept off the Internet,” the report notes. “Consequently, should hacking of a voting machine occur, the government would not, in many conceivable circumstances, be able to use the CFAA to prosecute the hackers.” At the Aspen event, Rosenstein said the report underscored how DOJ “must continually adapt criminal justice and intelligence tools to combat hackers and other cybercriminals.”


The cashless society is a con – and big finance is behind it

An ATM in west London
A cashless society brings dangers. People without bank accounts will find themselves further marginalised, disenfranchised from the cash infrastructure that previously supported them. There are also poorly understood psychological implications: cash encourages self-control, while paying by card or mobile phone can encourage spending. And a cashless society has major surveillance implications. Despite this, we see an alignment between government and financial institutions. The Treasury recently held a public consultation on cash and digital payments in the new economy. It presented itself as attempting to strike a balance, noting that cash was still important. But years of subtle lobbying by the financial industry have clearly paid off. The call for evidence repeatedly notes the negative elements of cash – associating it with crime and tax evasion – but barely mentions the negative implications of digital payments. The UK government has chosen to champion the digital financial services industry. This is irresponsible and disingenuous. We need to stop accepting stories about the cashless society and hyper-digital banking being “natural progress”.


Putting the promises of artificial intelligence to the test


To ensure valid, reliable, safe and ethical AI decision-making, we therefore need to develop robust approaches to teaching and training AI applications. This calls for a new testing regime, tailor-made for AI applications, that ensures adequate transparency in the decisioning mechanism, in a way that users can understand, and provides an assurance of fairness and non-discrimination in the decision process. The key challenge in developing such a testing regime is that AI software has many moving parts. When testing AI applications, engineers must consider many variables, including processing unstructured data, managing the variety and veracity of data, the choice of algorithms, evaluating the accuracy and performance of the learning models, and ensuring ethical and unbiased decisioning by the new system, along with regulatory and compliance adherence. New testing and monitoring processes that account for the data-dependent nature of these systems also need to be developed. One way to break down the development and validation requirements for AI is to divide the work into two stages. The first stage is the ‘Teach’ stage, where the system is trained to produce a set of outputs by learning patterns in training data through various algorithms.
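
A regime like this could pair accuracy checks with automated fairness probes. As a hedged sketch, here is one common probe, the demographic-parity gap; it is one of several possible fairness metrics, and the function names and data are my own illustration, not the article’s method:

```python
def demographic_parity_gap(y_pred, groups):
    """Difference in positive-prediction rates across groups.
    A large gap is a red flag for discriminatory decisioning."""
    rates = {}
    for g in set(groups):
        members = [p for p, grp in zip(y_pred, groups) if grp == g]
        rates[g] = sum(members) / len(members)
    return max(rates.values()) - min(rates.values())

# Example: a model approving 80% of group A but only 40% of group B
# shows a 0.4 parity gap, which a test suite could assert against.
preds  = [1, 1, 1, 1, 0, 1, 0, 0, 0, 1]
groups = ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"]
assert abs(demographic_parity_gap(preds, groups) - 0.4) < 1e-9
```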


Are you scared yet? Meet Norman, the psychopathic AI

Norman, psychopathic AI
The program flagged that black people were twice as likely as white people to reoffend, as a result of the flawed information that it was learning from. Predictive policing algorithms used in the US were also spotted as being similarly biased, as a result of the historical crime data on which they were trained. Sometimes the data that AI "learns" from comes from humans intent on mischief-making. When Microsoft's chatbot Tay was released on Twitter in 2016, the bot quickly proved a hit with racists and trolls who taught it to defend white supremacists, call for genocide and express a fondness for Hitler. ... "When we train machines by choosing our culture, we necessarily transfer our own biases," she said. "There is no mathematical way to create fairness. Bias is not a bad word in machine learning. It just means that the machine is picking up regularities." What she worries about is the idea that some programmers would deliberately choose to hard-bake badness or bias into machines. To stop this, the process of creating AI needs more oversight and greater transparency, she thinks.


UK alerted to potential cyber risks of Huawei equipment


The report said Huawei is failing to follow agreed security processes around the use of third-party components. “In particular, security critical third-party software used in a variety of products was not subject to sufficient control.” ... A company spokesman said: “We are grateful for this feedback and are committed to addressing these issues. Cyber-security remains Huawei's top priority, and we will continue to actively improve our engineering processes and risk management systems.” The report said the National Security Adviser, Mark Sedwill, had been alerted to the issues in February and that work continues to remediate the engineering process issues in other products deployed in the UK, prioritised by risk profile and deployment volume. “This work should give us the ability to provide end-to-end assurance that the code analysed by HCSEC is the constituent code used to build the binary packages executed on the network elements in the UK,” the report said, adding that until this work is completed, the Oversight Board can offer only limited assurance, due to the lack of the required end-to-end traceability from the source code examined by HCSEC through to the executables used by UK operators.


The reasons are simple. Reactive maintenance work costs four to five times as much as proactively replacing worn parts. When equipment fails because of a lack of awareness of degraded performance, there are immediate costs as a result of lost productivity, inventory backup, delays in completing the finished product, and more. A study by The Wall Street Journal and Emerson reported that unplanned downtime, 42% of which is caused by equipment failure, costs industrial manufacturers an estimated $50 billion per year. Even after production resumes, the costs of interrupting operations continue. According to the Customers’ Voice: Predictive Maintenance in Manufacturing report by Frenus, approximately 50% of all large companies face quality issues after an unplanned shutdown. In addition to savings, predictive maintenance can also result in competitive differentiation. When machine data can be used to perform predictive maintenance with a high level of precision, manufacturers can focus on differentiating products with digital capabilities like self-healing based on an awareness of technical health.
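
As a minimal illustration of what "awareness of degraded performance" can look like in code, the sketch below fits a crude linear trend over recent sensor readings and alerts when the drift exceeds a limit; the window size and threshold are invented for the example, and real systems would use far richer models:

```python
def degradation_alert(readings, window=24, slope_limit=0.05):
    """Alert when the recent trend in a sensor reading (e.g. vibration)
    drifts upward faster than slope_limit per sample, so a worn part
    can be replaced before it fails outright. Toy least-squares trend."""
    recent = readings[-window:]
    n = len(recent)
    if n < 2:
        return False
    mean_x = (n - 1) / 2
    mean_y = sum(recent) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(recent))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return num / den > slope_limit

# Example: a slowly rising vibration signature trips the alert.
print(degradation_alert([1.0 + 0.1 * i for i in range(30)]))  # True
```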



Quote for the day:


"Do not follow where the path may lead. Go instead where there is no path and leave a trail." -- Muriel Strode