Daily Tech Digest - September 25, 2017

Deloitte hit by cyber-attack revealing clients’ secret emails

The Guardian understands Deloitte clients across all of these sectors had material in the company email system that was breached. The companies include household names as well as US government departments. So far, six of Deloitte’s clients have been told their information was “impacted” by the hack. Deloitte’s internal review into the incident is ongoing. The Guardian understands Deloitte discovered the hack in March this year, but it is believed the attackers may have had access to its systems since October or November 2016. The hacker compromised the firm’s global email server through an “administrator’s account” that, in theory, gave them privileged, unrestricted “access to all areas”. The account required only a single password and did not have “two-step” verification, sources said.


Let’s Not Get Physical: Get Logical

In the ideal future, there would be no programmers responsible for data movement. Instead, the data infrastructure would provide the illusion that all data is almost instantly available at the physical point of its need. Data consumers, including data analysts, would log on to a data catalog, shop for, and request the data they needed. That data would be described at a high level, with its business meaning clearly spelled out for both the human and the machine. (We call that computable meaning.) When a user requested data to be delivered to a certain point (perhaps a virtual point in the cloud), the data infrastructure would start copying the data from its origin, using replication techniques—meaning no potentially deforming transformations would be built into the data movement.



How to Survive Wall Street’s Robot Revolution

Consider the junior investment banker, who spends much of his or her time collecting and analyzing data and then creating reports. Consulting firm Kognetics found that investment-banking analysts spend upwards of 16 hours in the office a day, and almost half of that is spent on tasks like modeling and updating charts for pitch books. Machine learning and natural language processing techniques are already very good at these tasks. Workers in compliance and regulation have a different worry: Over the last five years, their ranks have doubled, while overall headcount at banks declined 10 percent, according to research by Citigroup. Automating those activities — so-called regtech — could be good news for financial institutions looking to control the rising cost of compliance, and bad news for people looking to keep their jobs.


Data Governance: Just Because You Can, Doesn't Mean You Should

The impact of data use by businesses and government organizations on individuals, communities, and the environment is under constant scrutiny around the world. We are starting to see this formalized with security and privacy regulations such as the EU’s General Data Protection Regulation (GDPR) and the Privacy by Design approach for data systems. But even adhering to legal requirements and compliance regulations will not be enough to protect the business when it comes to ethical data use. Why? Ethical concerns precede legal and compliance requirements. And the stakes are large. Brand reputation is at risk. One wrong move could cause a significant loss, or even the failure of the whole business.


Transforming processes with big data: Refining company turns to SAP Process Mining

A key component of the effort to improve process management is SAP Process Mining by Celonis 4.2.0, a process mining software that uses "digital traces of IT-supported processes" to reconstruct what happens in a company. The application shows all of the process variants, and it provides visualization of all currently running processes. The technology is expected to play a critical role in the effort to enhance processes, providing full transparency and analysis so the company can observe business processes directly from the vast data present in IT infrastructure systems such as its SAP enterprise resource planning (ERP) platform. Based on the analytical findings and process key performance indicators (KPIs), the company will be able to identify process improvement opportunities, Rajatora said.


From accounting to code: one woman’s journey to a career in tech

The pressure to find that first role can feel overwhelming, and often people take the first semi-suitable job they find, at the expense of their actual passions. Getting that first experience may well open the doors to something better, but it could also colour your experience of this new industry, for better or worse. As far as I was concerned, I’d had a lot of experience working with traditional banks in my previous role, and spent at least four or five hours each day attempting to complete straightforward tasks across seven banks in five different countries. This meant that fintech and its potential to transform the banking landscape felt like a very attractive prospect to me, and that Starling Bank’s mission was something I felt strongly about.


The Battle for the Cloud Has Not Even Started Yet

The real war will break out when solutions, offered via the cloud, can support business innovation and business differentiation: When cloud solutions drive business benefit directly and not benefits to IT. For that to happen we need to talk about what a business does (its business processes and decisions) and how a business operates, not what IT does and how IT operates. This might seem like a small point but in the overall scheme of things, in the overall war, I think this is a massive point. If I am lucky I might even be around long enough to be proven right (or wrong). So this is where my little framework starts to be useful. Yes, IaaS is a well-known battle field and the armies are out there fighting it out. PaaS and SaaS will form the next battle fronts. In fact, they are forming up already, though many do not yet see them as important.


Digital is a Strategic Vehicle for Business Disruption

According to the research findings, the top three success factors for customer experience transformation are: 1. customer-centric culture, 2. management/leadership buy-in, and 3. visibility into and understanding of the end customer experience. The research also revealed that customer experience (CX) leaders are more likely to be using emerging technologies and creating personalized and omni-channel experiences. CX leaders are also more likely to use data to predict and anticipate consumer needs, understand lifetime value, and track customer advocacy. CX leaders also have a much higher sense of urgency - they believe there is no time to waste in transforming to deliver a superior customer experience. Data is at the heart of meeting the elevated expectations of today’s connected customers.


CISOs' Salaries Expected to Edge Above $240,000 in 2018

A candidate's skills, experience, and the complexity of the role will all need to be taken into consideration when assessing which salary percentile is appropriate. "The midpoint salary is a good indicator of someone who meets the requirements of an open role," Reed says. The midpoint range for CISOs and information systems security managers has improved over the past couple of years. For example, the Dark Reading 2016 Security Salary Survey found the median annual salary of IT security management was $127,000. But fast forward to 2018: the Robert Half Technology survey expects information systems security managers to earn as much as $194,250 if in the 95th percentile salary range, followed by $164,250 for the 75th percentile, $137,000 at the midpoint, and $115,250 at the 25th percentile, according to the report.


Facebook Relents to Developer Pressure, Relicenses React

"We won't be changing our default license or React's license at this time," said Wolff, who apologized "for the amount of thrash, confusion, and uncertainty this has caused the React and open source communities." Furthermore, he said, "We know this is painful, especially for teams that feel like they're going to need to rewrite large parts of their project to remove React or other dependencies." One developer in that camp is Matt Mullenweg -- the main guy behind the popular WordPress platform -- who threatened to redo project Gutenberg, a "block editor" from the WordPress community designed "to make adding rich content to WordPress simple and enjoyable." "The Gutenberg team is going to take a step back and rewrite Gutenberg using a different library," Mullenweg said in a Sept. 14 post.



Quote for the day:


"No plan survives contact with the future. No security is future proof. That's the joy and terror of cyber security." -- J Wolfgang Goerlich


Daily Tech Digest - September 24, 2017

How to Get One Trillion Devices Online

I think it’s easy to paint the optimistic picture of what, if we get all of this right, it could mean for our future. One trillion devices isn’t an absurd number. But these types of new technology can be very fragile. It’s interesting comparing CRISPR [the gene-editing technology] to genetically modified crops: GM crops had some bad publicity early on, and that essentially killed the area for a while, whereas CRISPR has had lots of positive publicity: it’s cured cancer in children. IoT will be similar. If there are missteps early on, people will lose faith, so we have to crack those problems, at least to a point where the good vastly outweighs the bad.


The developers vs enterprise architects showdown

Planning out and managing microservices seems like another area where EAs have a strong role for both initial leadership and ongoing governance. Sure, you want to try your best to adopt this hype-y practice of modularising all those little services your organisation uses, but sooner or later you’ll end up with a ball of services that might be duplicative to the point of being confusing. It’s all well and good for developer teams to have more freedoms on defining the services they use and which ones they choose, but you probably don’t want, for example, to have five different ways to do single sign-on. Each individual team likely shouldn’t be relied upon to do this cross-portfolio hygiene work and would benefit from an EA-like role instead, someone minding the big ball of microservices.




Human Brain Gets Connected to the Internet for the First Time

“Brainternet is a new frontier in brain-computer interface systems,” said Adam Pantanowitz, ... According to him, we’re presently lacking in easily-comprehensible data about the mechanics of the human brain and how it processes information. The Brainternet project aims “to simplify a person’s understanding of their own brain and the brains of others.” “Ultimately, we’re aiming to enable interactivity between the user and their brain so that the user can provide a stimulus and see the response,” added Pantanowitz, noting that “Brainternet can be further improved to classify recordings through a smart phone app that will provide data for a machine-learning algorithm. In future, there could be information transferred in both directions – inputs and outputs to the brain.”


Impact of Cyber Security Trends on Test Equipment

The trend of applying cyber security practices to test systems makes sense for several reasons, most notably the increase in cyber-security incidents that exploit unmonitored network devices. The second reason this trend makes sense is that security practices and technology for general-purpose IT systems are more mature. However, this trend does not make sense categorically for at least two reasons. Primarily, IT-enabled test systems are less tolerant of even small configuration changes. Users of IT systems can tolerate downtime and may not even perceive application performance differences, but special-purpose test systems (especially those used in production) often cannot tolerate them. Second, test systems often have unique security needs. They typically run specialized test software not used on other computers in the organization.


This Is What Happens When a Robot Assassin Goes to Therapy

In an email to Singularity Hub, series creator EJ Kavounas said, “With everyone from Elon Musk to Stephen Hawking making dire predictions about the possible dangers of machine intelligence, we felt the character could inject black comedy while discussing real issues of consciousness and humanity’s relationship with the unknown.” Nina starts with Alastair Reynolds, a psychiatrist. During their meeting she explains her past to him, and after watching a recording in which she detonated a missile to kill someone, she breaks into tears. So we know she has feelings—or at the very least, she’s good at faking them. “The biggest thing I try to keep in mind when playing Nina is that everything she does and says was specifically programmed to mimic human behavior and language,” according to actor Lana McKissack, who plays Nina.


What is cellular IoT and what are its use cases?

While LoRa offers the benefit of addressing ultra-low-power requirements for a range of low-bit-rate IoT connectivity, it is faced with a range limitation and must piggyback on an intermediary gateway before data can be aggregated and sent to a central server. The cost of deploying multiple gateways for a range of different IoT scenarios would defeat the very economic purpose of using an arguably low-cost solution like LoRa. Moreover, solutions like LoRa are not suited for a wide range of those IoT applications where HD and ultra-HD streaming is a prerequisite. 5G would potentially address a range of both low-bit-rate and ultra-HD IoT connectivity requirements, while also obviating the need to have an intermediary gateway, thus leading to additional cost savings. Moreover, 5G would have the potential to cover as many as one million IoT devices per square kilometer.


Gel-soaked conductive ‘fabric’ has potential for energy storage

As electric power becomes more important for everything from ubiquitous computing to transport, researchers are increasingly looking for ways to avoid some of the drawbacks of current electricity storage devices. Whether they are batteries, which release a steady stream of electric current, or supercapacitors, which release a sharper burst of charge, storage devices depend on conductive electrolyte fluids to carry charge between their electrodes. Susceptible to leakage and often flammable, these fluids have been behind many of the reported problems with batteries in recent years, including fires on board aircraft and exploding tablet computers (the latter being caused by short-circuiting inside miniaturised batteries).


Lambda vs. EC2

Unlike its predecessors, the underlying Lambda infrastructure is entirely unavailable to sysadmins or developers. Scale is not configurable; instead, Lambda reacts to usage and scales up automatically. Instead of using EC2, Lambdas instead use ECS, and the containers are not available for modification. In place of a load balancer, or an endpoint provided by Amazon, if you want to make Lambdas accessible to the web it must be done through an API Gateway, which acts as a URL router to Lambda functions. ... One of the major advantages touted by Amazon for using Lambda was reduced cost. The cost model of Lambda is time-based: you’re charged for requests and request duration. You’re allotted a certain number of seconds of use that varies with the amount of memory you require. Likewise, the price per millisecond varies with the amount of memory you require.
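
As a rough illustration of that time-based model, the sketch below estimates a bill from request count, duration, and memory. The rates here are placeholders, not AWS's actual prices, and the helper function is invented for illustration:

```python
# Illustrative sketch of a time-based, memory-scaled pricing model.
# The rates below are placeholders, not AWS's published prices.
PRICE_PER_REQUEST = 0.0000002       # hypothetical flat charge per invocation
PRICE_PER_GB_SECOND = 0.0000166667  # hypothetical charge per GB-second of duration

def lambda_cost(requests, avg_duration_ms, memory_mb):
    """Estimate cost: requests are billed flat, and duration is billed in
    GB-seconds, so allocating more memory raises the per-millisecond price."""
    gb_seconds = requests * (avg_duration_ms / 1000) * (memory_mb / 1024)
    return requests * PRICE_PER_REQUEST + gb_seconds * PRICE_PER_GB_SECOND

# One million 200 ms invocations at 512 MB of memory:
print(round(lambda_cost(1_000_000, 200, 512), 2))
```

Doubling the memory allocation doubles the GB-seconds, which is why the excerpt notes that the per-millisecond price varies with memory.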


Artificial Intelligence: The Gap between Promise and Practice

The majority of companies underestimate the importance of rich and diverse data sets to train algorithms, and especially the value of “negative data” associated with failure to successfully execute a task. Talent shortages and unequal access to data engineers and AI experts compound matters. Privacy and other regulations as well as consumer mistrust also temper progress. Whereas such barriers may be expected to decrease over time, there are also more subtle barriers to AI’s adoption that will need to be overcome to unlock its full potential. Algorithmic prowess is often deployed locally, on discrete tasks; but improved learning and execution for one step of a process does not usually improve the effectiveness of the entire process.


Researchers Develop Solar Cells That Can Be Sewn Onto Clothing

“The ideal wearable portable solar cell would be a piece of textile. That exists in the lab but is not a sellable product.” This new research from the RIKEN and Tokyo teams has taken that textile a big step forward from lab curiosity to actual product. What they have done is create a cell so small and flexible that it could, in time, be seamlessly woven into our clothing, rather than awkwardly placed on the outside of a jacket. These solar cells are phenomenally thin, measuring just three millionths of a meter in thickness. Given a special coating that can let light in while keeping water and air out, the cell was able to keep efficiently gathering solar energy even after being soaked in water or bent completely out of its original shape.



Quote for the day:


"Change is the end result of all true learning." -- Leo Buscaglia


Daily Tech Digest - September 23, 2017

Domain-Driven Design Even More Relevant Now

Compromises and trade-offs in software are unavoidable, and Evans encouraged everyone to accept that "Not all of a large system is going to be well designed." Just as "good fences make good neighbors", bounded contexts shield good parts of the system from the bad. It therefore stands to reason that not all development will be within well-defined bounded contexts, nor will every project follow DDD. While developers often lament working on legacy systems, Evans places high value on legacy systems, as they are often the money makers for companies. His encouragement for developers to "hope that someday, your system is going to be a legacy system" was met with applause.


At its most basic level, the monetary system is built around the idea of storing and transferring value. Banks are not going to disappear; there are still high-level efficiencies and advantages to having banks aggregate stored value and deploy it at a targeted rate of return. For example, a bank can write thousands of mortgages and then securitize a portion of said mortgages; this is never going to be a process suitable for the crowdfunding model. Blockchain technology creates numerous benefits across industries and applications, especially in regard to value-transfer. Banks can realize extraordinary efficiencies, streamline their back-office functions and reduce risk in the process. Smart contracts introduce the added dynamic of constraints and conditional operations for transferring or storing value only when certain conditions have been met and verified.
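
The conditional-transfer idea behind smart contracts can be sketched in plain Python. A real smart contract would run on a blockchain platform; the escrow model and condition names below are invented for illustration:

```python
# Plain-Python sketch of a smart contract's core idea: value is
# transferred only once every stated condition has been verified.
class EscrowContract:
    def __init__(self, amount, conditions):
        self.amount = amount
        self.conditions = {name: False for name in conditions}
        self.released = False

    def verify(self, name):
        """Mark a named condition as met (e.g. by an oracle or counterparty)."""
        if name in self.conditions:
            self.conditions[name] = True

    def release(self):
        """Transfer the stored value only when all conditions are verified,
        and only once."""
        if all(self.conditions.values()) and not self.released:
            self.released = True
            return self.amount
        return 0

mortgage = EscrowContract(250_000, ["title_clear", "funds_deposited"])
mortgage.verify("title_clear")
print(mortgage.release())          # 0 -- one condition still unverified
mortgage.verify("funds_deposited")
print(mortgage.release())          # 250000 -- all conditions verified
```

The constraint lives in the contract itself rather than in a bank's back office, which is the streamlining the excerpt describes.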


The Digital Twin: Key Component of IoT

In the real world, this might be a machine going into different fault and run states, where the effect of an input on the machine's state depends on the state the machine is in at the time. If I go far enough back in time, I realize that my system did receive an input "A", and so by the rules of my system, the later "B" results in my model producing the output "X". However if I don't go back far enough, I will think that I only got a "B", and the output should be "Y". But how far back is "far enough"? The input "A" might have arrived 100 milliseconds ago, or it might have arrived yesterday, or just before the weekend. Which means that I cannot just pick up and run my model over a selected time period any time I want to get an answer -- apart from the sheer impracticality of crunching the numbers while the user waits for an answer.
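
That history dependence can be shown with a toy state machine; the states, inputs, and rules below are invented for illustration:

```python
# Toy digital twin: the output produced for input "B" depends on whether
# an "A" was ever observed before it, so the model's answer depends on
# how far back its replay window reaches.
class MachineTwin:
    def __init__(self):
        self.saw_a = False  # persisted state that a replay must reconstruct

    def step(self, event):
        if event == "A":
            self.saw_a = True
            return None
        if event == "B":
            return "X" if self.saw_a else "Y"
        return None

twin = MachineTwin()
twin.step("A")            # arrived 100 ms ago, or yesterday -- it matters
print(twin.step("B"))     # "X": we replayed far enough back to see the "A"

fresh = MachineTwin()     # a replay window that missed the earlier "A"
print(fresh.step("B"))    # "Y": the wrong answer described above
```

This is why a digital twin must maintain state continuously rather than recompute it from an arbitrary slice of the event log.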


How Startups Can Source Data To Build Machine Intelligence

Data is the fuel of the new AI-based economy. Companies, consumers and web-connected devices create terabytes of data that enforce AI research and innovation. Some companies, like Google and Facebook, acquire data thanks to their users who provide ratings, clicks and search queries. For other companies, data acquisition may be a complicated process, especially if they need an enterprise solution for a limited number of members instead of a one-size-fits-all solution for millions of users. Luckily, the emerging AI markets offer a broad range of options for companies to kickstart their AI strategies. As a venture studio partner, I see startups struggling with sourcing the initial data sets for their business problems. That's why I've listed the most popular ways young companies can source data for their AI businesses.


The Challenges of Developing Connected Devices

Many startups can afford to be scrappy at the start and only have a few employees while gaining momentum; when your product is a connected device it is more difficult to build a small team with the range of skills needed to launch a successful product. Luckily, there are plenty of external resources available to these companies that can help. If a founding team is strong with hardware, they can use an agency in order to get their first software suite built. There are also services that they can leverage to help with the build and distribution chain. Any place where work can be offloaded in order to focus on value increases their chances of success. They can then start hiring out a team to save money once they have traction.



New alliance advocates the blockchain to improve IoT security, trust

The alliance says that the groups' open-source tools and property will help the enterprise register IoT devices and create event logs on decentralized systems, which in turn will lead to a trusted IoT ecosystem which links cryptographic registration, "thing" identities, and metadata ... "The world is beginning to recognize the potential of blockchain technology to fundamentally reshape the way business is done globally - and we're still just scratching the surface," said Ryan Orr, CEO of Chronicled. "At this early stage we think it's vitally important to establish an inclusive framework that ensures openness, trust, and interoperability among the many parties, in both the public and private sectors, that we believe will begin to adopt blockchain technology over the next several years."


Ethereum’s inventor on how “initial coin offerings” are a new way of funding the internet

In general, you know that when you have public goods, public goods are going to be in very many cases underfunded. So the interesting thing with a lot of these blockchain protocols is that for the first time you have a way to create protocols and have protocols that actually manage to fund themselves in some way. If this kind of approach takes off, potentially, it could end up drastically increasing the quality of bottom-level protocols that we use to interact with each other in various ways. So ethereum is obviously one example of that, we had the ether sale, and we got about $8 to $9 million by, I guess, basically selling off a huge block of ether. If you look at lots of cryptocurrencies, lots of layer-two kind of projects on top of ethereum, a lot of them tend to use a similar model.


New Theory Cracks Open the Black Box of Deep Learning

It remains to be seen whether the information bottleneck governs all deep-learning regimes, or whether there are other routes to generalization besides compression. Some AI experts see Tishby’s idea as one of many important theoretical insights about deep learning to have emerged recently. Andrew Saxe, an AI researcher and theoretical neuroscientist at Harvard University, noted that certain very large deep neural networks don’t seem to need a drawn-out compression phase in order to generalize well. Instead, researchers program in something called early stopping, which cuts training short to prevent the network from encoding too many correlations in the first place.
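
Early stopping itself is simple to sketch. The loop below is a generic illustration, not the specific procedure the researchers used; the `train_epoch` and `val_loss` callbacks are placeholders for a real training loop:

```python
# Generic early stopping: halt training once validation loss stops
# improving for `patience` consecutive epochs, cutting training short
# before the network encodes too many spurious correlations.
def train_with_early_stopping(train_epoch, val_loss, max_epochs=100, patience=3):
    best, stale, best_epoch = float("inf"), 0, 0
    for epoch in range(max_epochs):
        train_epoch(epoch)          # one pass over the training data
        loss = val_loss()           # evaluate on held-out data
        if loss < best:
            best, stale, best_epoch = loss, 0, epoch
        else:
            stale += 1
            if stale >= patience:
                break               # validation loss has plateaued; stop
    return best_epoch, best

# Toy validation-loss curve that improves, then plateaus:
losses = iter([1.0, 0.6, 0.4, 0.41, 0.42, 0.43, 0.39])
print(train_with_early_stopping(lambda e: None, lambda: next(losses)))
```

Note the trade-off the article hints at: stopping at epoch 2 here means the later improvement to 0.39 is never seen, which is the price of guarding against overfitting.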


How to Measure Continuous Delivery

Continuous delivery is all about improving the stability and speed of your release process, so unsurprisingly you should measure stability and speed! Those are intangibles, but they’re not hard to measure. In How To Measure Anything, Douglas Hubbard shows how to use clarification chains to measure intangibles - you create tangible, related metrics that represent the same thing. Luckily, the measures have already been identified for us. In the annual State Of DevOps Report Nicole Forsgren, Jez Humble, et al. have measured how stability and throughput improve when organisations adopt continuous delivery practices. They measure stability with Failure Rate and Failure Recovery Time, and they measure throughput with Lead Time and Frequency. I’ve been a big fan of Nicole and Jez’s work since 2013.
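
As a sketch, those four measures can be computed from a deployment log. The field names and sample data below are invented for illustration:

```python
# Compute the four continuous-delivery measures named above from a
# hypothetical deployment log (field names and data are illustrative).
from datetime import datetime, timedelta

deployments = [
    {"committed": datetime(2017, 9, 1, 9), "deployed": datetime(2017, 9, 1, 17), "failed": False},
    {"committed": datetime(2017, 9, 2, 9), "deployed": datetime(2017, 9, 3, 11),
     "failed": True, "recovered": datetime(2017, 9, 3, 12)},
    {"committed": datetime(2017, 9, 4, 9), "deployed": datetime(2017, 9, 4, 15), "failed": False},
]

# Throughput: Lead Time (commit -> deploy) and Frequency (deploys per day).
lead_times = [d["deployed"] - d["committed"] for d in deployments]
avg_lead_time = sum(lead_times, timedelta()) / len(lead_times)
span_days = (deployments[-1]["deployed"] - deployments[0]["deployed"]).days or 1
frequency = len(deployments) / span_days

# Stability: Failure Rate and Failure Recovery Time.
failures = [d for d in deployments if d["failed"]]
failure_rate = len(failures) / len(deployments)
avg_recovery = sum((f["recovered"] - f["deployed"] for f in failures), timedelta()) / len(failures)

print(avg_lead_time, frequency, failure_rate, avg_recovery)
```

The point of the clarification chain is exactly this: "stability" and "speed" become four concrete numbers you can trend over time.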


The Decline of the Enterprise Architect

No matter their place in a lumbering bureaucracy or how many eye-rolls they may inspire among developers, these people are smart, competent, and valuable to their organizations. So my opinions and criticisms have nothing to do with the humans involved. That said, I think this role is on the decline, and I think that’s good. This role exists in the space among many large software groups. In the old days, they coordinated in elaborate, mutually dependent waterfall dances. These days, they “go agile” with methodologies like SAFe, which help them give their waterfall process cooler, more modern sounding names, like “hardening sprint” instead of “testing phase.” In both cases, the enterprise architect has a home, attending committee-like meetings about how to orchestrate the collaboration among these groups.



Quote for the day:


"Your excuses are nothing more than the lies your fears have sold you." -- Robin Sharma