Daily Tech Digest - August 08, 2017

CIO interview: David Ivell, CIO, The Prince’s Trust

According to the CIO, the methodology allows the charity to use a test-learn-adapt approach in which the content, experience, platform and online mentoring are all being tested at scale and refined in real time. This approach also enables the team to quickly identify how the business will need to adapt because of the change process, as well as where the bottlenecks are and what technology is needed to bypass them. “We took technology components off the shelf and put them in our business, and rather than evaluate, we asked the business to tell us how it would make them work,” says Ivell. “We have worked with organisations around the world that are seen as leaders in the e-mentoring space and have learnt from others.”


Malicious code in the Node.js npm registry shakes open source trust model

Between July 19 and July 31, an account named hacktask published a series of packages on npm with names that were similar to existing npm packages, wrote npm CTO CJ Silverio. Packages are used by developers to implement common functions without having to write the code from scratch. If developers aren’t careful and add the wrong packages as dependencies to their code, they wind up with malicious code in their applications. “The package naming was both deliberate and malicious—the intent was to collect useful data from tricked users,” Silverio said. The account hacktask has been closed, all packages associated with the account removed from npm, and the user’s email address banned from using npm.
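Typosquat detection of this kind can be approximated with a simple string-similarity check. The sketch below is purely illustrative (the `POPULAR` list and threshold are invented assumptions, not npm's actual defenses): it flags dependencies whose names are near-misses of well-known packages.

```python
import json
from difflib import SequenceMatcher

# A few well-known npm package names (illustrative, not exhaustive)
POPULAR = ["lodash", "express", "request", "cross-env", "babel-cli"]

def suspicious_deps(package_json_text, threshold=0.85):
    """Flag dependencies whose names are near-matches to popular
    packages without being exact matches -- a typosquatting signal."""
    deps = json.loads(package_json_text).get("dependencies", {})
    flagged = []
    for name in deps:
        for known in POPULAR:
            ratio = SequenceMatcher(None, name, known).ratio()
            if name != known and ratio >= threshold:
                flagged.append((name, known))
    return flagged

manifest = '{"dependencies": {"crossenv": "*", "express": "*"}}'
print(suspicious_deps(manifest))  # [('crossenv', 'cross-env')]
```

A real registry would need a much larger popularity list and rate limits, but even this crude check catches the `crossenv`-for-`cross-env` style of name used in the attack.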


Google Teaches New Managers These 6 Things

Implementing research from Dr. Carol Dweck, professor of psychology at Stanford University, Google encourages its managers to develop a growth mindset. As opposed to a fixed mindset (the belief that skills and abilities are predetermined), individuals with a growth mindset believe that intelligence can be cultivated. This simple idea develops leaders who are more eager to learn, challenge themselves, and experiment, and it eventually boosts their performance. Although success will always require tenacity, hard work, and concentration, this research suggests these traits are byproducts of a quality that underpins them: optimism. Also, Google encourages its managers to identify values and leverage them within their management styles.


Don’t waste your time with a hybrid cloud

The problem with hybrid clouds is that they are typically defined to be paired private and public clouds, which is the correct definition. The problem with this architecture is that they have the concept of “private cloud” in the architecture, and that dog does not seem to hunt anymore.  The feature gap between public and private clouds has grown so wide that the private cloud demos that I attend are laughable considering the subsystems that enterprises need, such as security, governance, databases, IoT, and management, versus what private clouds actually deliver.  Moreover, you’re on the hook for installing the software on typically new on-premises systems, hooking everything up, testing it, and making that private cloud work and play well with a public cloud provider that may do a poor job in providing the integration. At the end of the project, you’ll feel like an abused spouse.


Cyber resilience weaves cybersecurity into dev process

Cyber resilience is a concept that is similar to cybersecurity. I think it's kind of grown out of the cybersecurity realm, but it's broader. It encompasses not just security, but the stability, the integrity and the availability of the environment that you're developing or that you're bringing to your customers. Traditionally, the security concerns have been siloed to the CISO's office. And the disaster recovery and business continuity concerns have been siloed to the operations team, which is a different team. The CISO typically sits separate from the operations team and separate from the development team. The CISO and the [quality assurance] organization oversee development, and it's been a piecemeal approach to making sure that what you deliver is actually stable, it's robust, it's resilient and it's secure.


Machine learning: A chance for engineering students to look beyond software services

“The good part is courses are available online. Instead of working with the AICTE (All India Council for Technical Education) to change curriculum, which takes a long time, we are working to get students and professionals to access these courses,” says Sangeeta Gupta, senior vice-president of Nasscom, the apex body for the $154 billion tech and BPO industry.  Nasscom has yet to analyse the job scenario, estimate the number of people needed, or the number of people who might lose jobs because of new technologies. Gupta talks about platforms for MOOCs, or massive online open courses, like Coursera, Udacity and edX, through which Nasscom is trying to address the gap between industry demand and existing skills, both within the industry and among final-year engineering students.


UK calls for smart car cyber protection

Transport minister Martin Callanan said it is important that smarter and self-driving technologies are protected against cyber attacks. “That’s why it’s essential all parties involved in the manufacturing and supply chain are provided with a consistent set of guidelines that support this global industry. Our key principles give advice on what organisations should do, from the board level down, as well as technical design and development considerations,” he said. Mike Hawes, chief executive of the Society of Motor Manufacturers and Traders, welcomed the government initiative: “We’re pleased that government is taking action now to ensure a seamless transition to fully connected and autonomous cars in the future and, given this shift will take place globally, that it is championing cyber security and shared best practice at an international level.”


Cisco admits accidentally losing customer data due to Meraki cloud configuration error

Cisco did not specify how many customers were affected in the incident. Its Meraki service is used by over 140,000 customers and 2 million network devices, according to the company's website. Customer data erased included Meraki dashboard custom splash themes, custom floor plans, branding logos, summary reports and uploaded device placement photos. Other data deleted in the incident included custom enterprise apps, interactive voice response menus, music on hold, contact images and voice mail greetings. The latest cloud-related incident comes amid security experts' growing concerns about digital and cloud security following numerous gaffes that have led to users' data being erroneously publicly exposed.


Android vs. iOS: Which Is More Secure?

Android is expected to maintain the lead this year, according to Forrester, with 74% market share, followed by Apple with 21% and Windows Phone with just 4%. "The truth is, when Android gets attacked, it tends to be more vulnerable because there are more devices out there and more people also hear about it," Gold said. "Android also has a problem in that the latest version of Android OS is generally a small portion of the base of devices in the marketplace. So, when upgrades are issued, not everyone gets them. Whereas, when Apple upgrades, everyone gets it." Additionally, as enterprises develop more of their own custom applications -- many of them mobile apps as part of a mobile-first strategy -- in-house developers are increasingly at risk of unwittingly using open-source code rife with vulnerabilities.


Get started with Visual Studio Code

Built using GitHub’s cross-platform Electron framework, Visual Studio Code is a full-featured development editor that supports a wide selection of languages and platforms, from the familiar C and C# to modern environments and languages like Go and Node.js, with parity between Windows, MacOS, and Linux releases. Visual Studio Code quickly became a standard part of my personal device setup, replacing Notepad as my default text editor, and it’s now one of the first tools I install on a new PC. With its support for IntelliSense code highlighting, it’s also now my standard code viewer for web content, and it’s where I build and test JSON and JavaScript, for working with microservices and for configuring containers. Visual Studio Code has even added support for a command-line terminal, including the Windows Linux Subsystem, so you can use it to build and test Unix apps without having to leave your PC.



Quote for the day:


“People haven't even begun to tap into the potential of what the mind is capable of doing...” -- David Blaine


Daily Tech Digest - August 07, 2017

WannaCry 'hero' to pay $30,000 for bail, plead not guilty to Kronos trojan charges

Be a “hero” to the internet; come to the U.S. and get arrested. That is the situation that shook the security community when the FBI arrested British security researcher Marcus Hutchins after he left Def Con. Hutchins, aka MalwareTech, was arrested Aug. 2 for allegedly creating the banking trojan Kronos. Earlier this year, Hutchins was dubbed a hero for finding the WannaCry ransomware kill switch and was then doxed by reporters as a show of gratitude. His bail was set at $30,000, yet he spent the weekend in jail because there wasn’t enough time to pay the bail before the clerk’s office closed on Friday. After he is released on Monday, Hutchins will remain in the U.S. with GPS monitoring and go to Wisconsin, where he will face a six-count federal indictment.


Why continuous learning is key to AI

Why do you need a machine learning library and what algorithms are important for continuous learning? Recall that in RL one needs to learn how to map observations and measurements to a set of actions, while trying to maximize some long-term reward. Recent RL success stories mainly use gradient-based deep learning for this, but researchers have found that other optimization strategies such as evolution can be helpful. Unlike supervised learning where you start with training data and a target objective, in RL one only has sparse feedback, so techniques like neuroevolution become competitive with classic gradient descent. There are also other related algorithms that might become part of the standard collection of models used for continuous learning.
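The appeal of evolution-style optimization is that it needs nothing but a scalar reward signal. The toy (1+λ) evolution strategy below is a hypothetical illustration, not any particular library's implementation: it improves a parameter vector purely by perturb-and-keep, with no gradients at all.

```python
import random

def evolve(reward, dim=3, population=20, sigma=0.1, generations=200):
    """Minimal (1+lambda) evolution strategy: perturb the best-known
    parameter vector, keep a perturbation only if the reward improves.
    Uses only scalar reward feedback -- no gradient information."""
    best = [0.0] * dim
    best_r = reward(best)
    for _ in range(generations):
        for _ in range(population):
            cand = [x + random.gauss(0, sigma) for x in best]
            r = reward(cand)
            if r > best_r:
                best, best_r = cand, r
    return best, best_r

# Toy reward with its peak at (1, 1, 1); the strategy climbs toward it
reward = lambda p: -sum((x - 1.0) ** 2 for x in p)
params, score = evolve(reward)
```

Real neuroevolution applies the same loop to neural-network weights and adds tricks like population-wide gradient estimates, but the perturb-evaluate-select skeleton is the same.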


Fintech’s Artificial Intelligence Revolution: The Missing Link

Technology waits for no rules or regulations, and AI is no different. The potential profit associated with innovation economics also contains the risk that machine intelligence will be developed and deployed without thoughtful consideration of the potential perils. And AI brings a unique set of risk challenges. If they are not well managed, we may create new and greater risks. ...  Should something go wrong, we might not be able to define the problem, let alone a solution. Already, problems with technology not nearly as complex as deep learning have disrupted the markets. There have been six “stock market crashes” due largely to flawed market operations, most notably the “flash crash” of 24 August 2015. And while markets recovered from these crashes rather quickly, the causes were not immediately grasped.


6 Digital Strategies, and Why Some Work Better than Others

Digitization is enabling new, disruptive models that aggressively compete with legacy models, putting material pressure on incumbents’ revenue and profit growth. As incumbents fight back with their own digital strategies, our research shows that they often trigger a second wave of competition, closer to the notion of Schumpeterian imitation, where incumbents themselves start to innovate, sometimes aggressively, against the threat of entrants slashing yet more revenue and profit growth. We estimate that on average, both waves of digital competition have taken out half of the annual revenue growth and one third of the growth in earnings from incumbents that have failed to respond to digital.


Open source is powering the digital enterprise

By leveraging broad based collaboration and strong communities of independent developers, open source innovation is transforming the very core of information technology and enabling organizations to win in today’s digital economy. As a community we all gain from these efforts. Technology is now being strategically applied to support the fast-paced and rapidly changing demands of businesses and the customers they serve. In the new digital reality, an organization’s cloud, application, IoT, and data analytics strategies (to name a few) can make or break an organization’s success. Businesses are realizing they are now in the technology business. This drive to meet business needs for innovation has in-part led to a surge in the adoption of enterprise-grade open source technology.


Robots created a language. No need to panic

The emergence of communication between "agents" -- individuals who don't have a common language to start with -- has been studied with the help of computer simulations since the 1990s. The same mechanisms emerged as in the more contemporary work: computer programs, like humans, end up finding optimal ways to communicate. The findings have been exciting to those interested in the origins of language, but they aren't about any kind of diabolical superintelligence. Our distant ancestors, who were not particularly smart, also found ways to talk to folks from other tribes, and to bargain with them if necessary. That machines can do it, too, when set a specific task on which they must work until a set outcome is achieved, is a far cry from Skynet dystopia.


How to (not) use the large object heap in .Net

Copying and moving large objects not only would involve significant overhead for the garbage collector – the GC would need twice as much memory for garbage collection – but moving large objects would be very time-consuming as well. Therefore, unlike the small object heap, the large object heap is not compacted during garbage collection. So, how is memory in the large object heap reclaimed? Well, the GC never moves large objects – all it does is remove them when they are no longer needed. In doing so, memory holes are created in the large object heap. This is what is known as memory fragmentation. One point to note here is that although the GC doesn’t compact the large object heap, it does combine adjacent free blocks in the heap to make larger blocks available.
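The free-block coalescing the GC performs on the large object heap can be sketched schematically. This is in Python rather than C#, and purely illustrates the bookkeeping idea (merging adjacent holes into larger free blocks), not the actual CLR implementation.

```python
def coalesce(free_blocks):
    """Merge adjacent free (start, size) blocks into larger ones --
    the kind of bookkeeping a non-compacting heap applies so that
    fragmentation holes can still satisfy larger allocations."""
    merged = []
    for start, size in sorted(free_blocks):
        if merged and merged[-1][0] + merged[-1][1] == start:
            # This hole begins exactly where the previous one ends:
            # extend the previous block instead of adding a new one.
            merged[-1] = (merged[-1][0], merged[-1][1] + size)
        else:
            merged.append((start, size))
    return merged

# Two adjacent holes become one larger block; the isolated hole stays.
print(coalesce([(0, 100), (100, 50), (400, 30)]))
# [(0, 150), (400, 30)]
```

Note that coalescing only merges holes that happen to be adjacent; unlike compaction, it never moves live objects, so fragmentation between live objects remains.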


Six Ways to Curb the Costs of a Data Breach

Missteps happen fast and have serious consequences. One example is customer communications. After a breach, the pressure to communicate quickly with customers can be intense. But ineffective communications can cause panic, dramatically increasing the rate at which customers phone into call centers and sign up for credit monitoring. Credit monitoring alone can cost $5 to $30 per person. Data breach specialists, such as PR consultants or data privacy lawyers, often have seen as many as hundreds of data breaches and are highly practiced at helping you craft a genuine story that keeps confusion – and costs – down. ... In the wake of a breach, a company may be investigated by a number of regulatory agencies. While it’s not guaranteed to occur, it is likely, and there are simple steps you can take to prevent sensational fines if it does.


What Women in Cybersecurity Really Think About Their Careers

Fewer than half came to security via IT or computer science. The rest came from backgrounds in compliance, psychology, internal audit, entrepreneurship, sales, and art. Ten percent say they joined the industry because they "like to break things." "Women in this field say it's actually fun, and they're having a good time. They are feeling they are doing meaningful and impactful work and it's deeply satisfying to them," says Wong, who also conducted deep-dive interviews with multiple women from the survey who were willing to be quoted in the final report. "You don't necessarily have to have a computer science degree to contribute." Nearly three-quarters of them say the value they bring to cybersecurity is their ability to communicate well across cross-functional teams.


Cyberwar: A guide to the frightening future of online conflict

The tools of cyberwarfare can vary from the incredibly sophisticated to the utterly basic. It depends on the effect the attacker is trying to create. Many are part of the standard hacker toolkit, and a series of different tools could be used in concert as part of a cyber attack. For example, a Distributed Denial of Service attack was at the core of the attacks on Estonia in 2007. Ransomware, which has been a constant source of trouble for businesses and consumers, may also have been used not just to raise money but also to cause chaos. There is some evidence to suggest that the recent Petya ransomware attack, which originated in Ukraine but rapidly spread across the world, may have looked like ransomware but was being deployed to effectively destroy data by encrypting it with no possibility of unlocking it.



Quote for the day:


"Mistakes are always forgivable, if one has the courage to admit them." -- Bruce Lee


Daily Tech Digest - August 06, 2017

Machine learning, artificial intelligence and robo-advisers: The future of finance?

One issue relating to neural network–based machine learning–enabled AI applications in investment management is the black box issue, in which the workings of an algorithm are not understood by its user or other stakeholders and lead to potentially unintended actions or consequences. This is a well-known headache for regulators trying to ensure market stability. Although some attempts have been made to check the source code of algorithmic traders, the most effective protection against algorithmic errors is circuit breakers on markets that limit the amount of damage a failing algorithm can cause. We highlighted in our response to the EC’s consultation document on fintech an example from the world of algorithmic trading on the use of circuit breakers as a de facto solution to the black-box challenge.
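A market circuit breaker of the kind described can be sketched in a few lines. The class name, the reference price, and the 5% band below are illustrative assumptions, not any exchange's actual rules: the point is that trading halts once price moves outside a band, capping the damage a runaway algorithm can do regardless of why it misbehaves.

```python
class CircuitBreaker:
    """Illustrative market circuit breaker: halt trading once the
    price moves more than `limit` (a fraction) away from a reference
    price, limiting the damage a failing algorithm can cause."""
    def __init__(self, reference, limit=0.05):
        self.reference = reference
        self.limit = limit
        self.halted = False

    def on_trade(self, price):
        move = abs(price - self.reference) / self.reference
        if move > self.limit:
            self.halted = True
        return not self.halted  # False -> reject this and later trades

cb = CircuitBreaker(reference=100.0, limit=0.05)
print(cb.on_trade(103.0))  # True: within the 5% band
print(cb.on_trade(94.0))   # False: a 6% drop trips the breaker
```

Because the breaker inspects only prices, not the trading algorithm itself, it works even when the algorithm is an inscrutable black box, which is exactly why regulators favor it.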


How to Avoid a Total Project Meltdown

The better you know how to manage projects, the easier it will be to take on bigger challenges and contribute more to your company’s success. While it’s important to develop effective project management skills, you must also learn to recognize the common pitfalls that can quickly derail your efforts. By doing so, you’ll have no problem dodging these issues and letting your superior project management skills speak for themselves. Always take a top-down project management approach. This will ensure you’re focused on who will be leading various elements of the project. While every other part of your project is important (e.g., who else will be involved, what resources you’ll need, etc.), if you don’t pick strong team leads, you may be setting the project up for failure.


Five Key Attributes of True Software-Defined Storage

Common issues include the lack of cloud economics and complexity leveraging existing infrastructure. While these experts are certainly raising real concerns about the validity of SDS, we must first take a step back to understand the root of these issues: the promise of the technology will only align with the hype once the industry can agree on a set definition of what it means to be truly “software-defined.” In order to bridge the gap of understanding these nuances of SDS, we need to agree on a definition once and for all to help us look past the industry jargon to truly get to the heart of the technology and how it can help companies achieve simple and cost-effective data management. Simply put, SDS is an approach to data storage in which the programming that controls storage-related tasks is decoupled from the physical storage hardware.


The enterprise technologies to watch in 2017

We've seen the steady shift from SMAC (social, mobile, analytics, cloud) that dominated this list at its inception to one that is more focused on artificial intelligence, Internet of Things, distributed ledgers, immersive digital experiences (AR/VR), edge computing, low code tools, and much more. That's not to say that essentially mainstream technology bases like public cloud, cybersecurity, or big data are staid and therefore are about to come off the list. In fact, they are shifting and evolving more now than ever before and should remain at the top of the technologies that most enterprises should be watching very closely today. Based on my analysis then, here is the short list of enterprise technologies that organizations should be tracking for building skills, assessing their strategic and tactical impact, experimenting with


5 Industries That Will Be Using Blockchain Sooner Rather Than Later

Lauded as the next level of distributed internet, blockchain technology enables unprecedented transparency, efficiency and flexibility in terms of facilitating transactions without a centralized control or administration. The distributed ledger technology has powered cryptocurrencies like bitcoin and Ethereum, but it is not only in fintech where it shines. Any industry that involves peer-to-peer transactions will also benefit from its distributed and peer-authenticating features. By taking away the middleman or central authority in establishing trust, and by ensuring that transactions can be audited by any and all parties involved, the blockchain provides a perfect way to protect against potentially fraudulent transactions. That's not to say that the tech is immune from external security threats, but the system in itself provides trust intrinsically.


The Buyers: Consumers, Post-Hacking

Consumers should probably go ahead and assume that at one point or another their personal details will likely be breached by an online hacker. With nearly every possible part of people’s lives living online, and sometimes on an internal intranet as well, it’s safe to assume that nothing is truly, well, safe. Beyond the usual advice of changing passwords upon hearing of data breaches, there’s not much consumers can really do. With all of this knowledge, the issue that then arises for retailers is how to bring consumers back to the path of purchasing — post-hacking occurrence. Typical advice from top security officials can range from being careful with third-party vendors to giving consumers more control over their own data. Consumer insights company Diginomica’s co-founder, Jon Reed, offered a succinct way for retailers to gain and maintain consumer trust.


How much is diversity in tech worth? $400B says CompTIA CEO

Thibodeaux clarified that a tech skills gap isn't necessarily the problem, but instead a confidence gap is what prevents many women and people of color from pursuing jobs in the tech industry. TechRepublic's Hope Reese noted the women's confidence gap after watching University of Louisville's inaugural Women in Leadership Forum. "The panel agreed that women often judge themselves too harshly as well, saying 'I'm not good enough,' far more often than men," Reese wrote. A lack of career information is another big reason many professionals don't join the tech workforce, Thibodeaux said. Organizations need to provide career information to a more diverse group, Thibodeaux added, or else these individuals won't know to pursue tech careers.


Making 100% code coverage as easy as flipping a coin

Shipping a new version of a distributed, scale-out file system every two weeks means we have to be confident that every commit to our codebase is bug-free. To that end, we have tens of thousands of unit tests, thousands of integration tests, and many hundreds of full system tests that batter every build to verify that we haven’t introduced a regression. While these kinds of tests are important, we wanted to go further. We wanted a way to execute every possible path through a particular piece of code so that we could verify that, no matter what, the behavior of the code was correct and the invariants of the system held. A traditional approach to this problem might be to manually inspect code coverage reports and craft individual tests to exercise each branch of the code, but this is brittle because it requires a human to inspect the code coverage and it often involves a complex, sometimes convoluted test to exercise the uncovered code.
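The "coin flip" idea — enumerating every combination of branch outcomes rather than hand-crafting one test per uncovered line — can be sketched like this. It is a toy illustration under invented names, not the vendor's actual harness: each branch decision is driven by an injected boolean, so iterating over all flip vectors executes every path and checks an invariant on each one.

```python
from itertools import product

def run_with_flips(flips):
    """Toy system whose control flow is driven by injected boolean
    'coin flips'; enumerating all flip vectors executes every path."""
    it = iter(flips)
    state = 0
    if next(it):          # branch 1
        state += 1
    if next(it):          # branch 2
        state *= 2
    if next(it):          # branch 3
        state -= 3
    return state

# 3 branches -> 2**3 = 8 paths; verify an invariant on every one.
for flips in product([False, True], repeat=3):
    result = run_with_flips(flips)
    assert -3 <= result <= 2, (flips, result)
```

The payoff is that coverage becomes a byproduct: every branch is exercised by construction, and what the test asserts is a system invariant rather than a particular execution trace.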


Code Coverage Should Not Be a Management Concern

Before going any further, I’ll quickly explain the concept of code coverage for those not familiar, without belaboring the point. Code coverage measures the percentage of code that your automated unit test suite executes when it runs. So let’s say that you had a tiny codebase consisting of two methods with one line of code each. If your unit test suite executed one method but not the other, you would have 50% code coverage. In the real world, this becomes significantly more complicated. In order to achieve total coverage, for instance, you must traverse all paths through control flow statements, such as if conditions, switch statements, and loops. When people debate how much code coverage to have, they aren’t debating how “correct” the code should be. Nor do they debate anything about the tests themselves, such as quantity or quality.
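The two-method example translates directly into code. The metric below is deliberately simplistic — real coverage tools instrument individual lines and branches rather than counting whole functions — but it makes the 50% arithmetic concrete.

```python
def add(a, b):
    return a + b

def subtract(a, b):
    return a - b

def line_coverage(functions, tested):
    """Toy coverage metric: the percentage of one-line functions that
    the test suite actually executed."""
    return 100 * len(tested) / len(functions)

# The 'suite' calls add() but never subtract(): 50% coverage.
tested = {add}
print(line_coverage([add, subtract], tested))  # 50.0
```

Once control flow enters the picture, the same idea applies per branch: an `if`/`else` only counts as covered when tests have driven execution down both arms.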


Cyber Disruption, State Government & The Constitution

A cyber disruption has a duration to it, a reach to it that is bigger than a typical incident. Where an incident might last hours, a cyber disruption would last days, weeks, months. Where a cyber incident might affect a certain application or certain small set of users, a cyber disruption has a bigger reach than that—it could affect an entire enterprise, it could affect a region, a city. The importance that we place on this in the first part of this document is differentiating a disruption—it’s very significant kind of an event. And we anticipate that it will probably be related to some major part of our infrastructure—power disruption, water, sewer, gas delivery, communications. I put a scenario in the front end of that report that describes loss of power and what happens if you find that you don’t have communications either, with no cell or mobile service. A very significant kind of event.



Quote for the day:

"It doesn't matter how much we know, what matters is how clearly others can understand what we know." -- Simon Sinek

Daily Tech Digest - August 05, 2017

When FinTech meets the Internet of Things

The new EU Privacy Regulation is in the process of being adopted and among the changes that are going to be introduced there will be a massive increase of the potential fines up to 4% of the global turnover of the breaching entity. Likewise, the development of technologies such as fintech that require the collection and the analysis of large amounts of data will require a so-called “privacy impact assessment” to be submitted and validated by privacy authorities. The implementation of a privacy by design approach can be the sole defense in a regulatory framework where the burden of proof of having complied with regulations will be on the investigated entity, i.e. the company exploiting the fintech platform. In addition to privacy issues, there are legal issues as to the ownership of data which need to be reviewed under a privacy, an intellectual property and a contractual confidentiality perspective.


Big Data: 6 Key Areas Every Product Manager Should Address

The two main considerations regarding storage are: how to store and where to store. How to store your data depends on your overall use case. The type of data you produce will determine the type of database you will require. If you have structured data, then a relational database such as SQL Server or MySQL is your best bet. On the other hand, if you have unstructured data such as images, videos, or tweets, then you probably need a schema-less database such as Hadoop or MongoDB. Or maybe, like some systems I’ve worked on, you need both. I’m not suggesting that Product Managers dictate the type of DB or the architecture of the data tier. That’s the role of your Architecture and IT team. However, it IS our job as Product Managers to define clear use cases and convey those to our technical team, so they can implement the right infrastructure for your product.


New fintech partnerships to push automation in trade finance

“We looked at a dozen OCR products, but none served the idea that we had, namely, to have an OCR that could put together all the different formats and make it work. That’s the reason we had to develop it ourselves.” As part of the collaboration, Traydstream have also become a member of PFU’s Imaging Alliance Programme, which gives access to tools, resources and code to build innovative scanner solutions. The data-sharing collaboration with Lloyd’s List Intelligence, meanwhile, will augment Traydstream’s compliance engine, the second part of its solution. This engine uses machine learning algorithms to quickly scrutinise the digital transaction data for a range of issues, such as blank fields, the inconsistency of names, industry-specific legislation, sanctions and country restrictions, to help banks and large corporations tackle anti-money laundering (AML) and compliance issues.
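Rule checks of the sort described — blank fields, name inconsistency, country restrictions — can be sketched as a simple screening function. The field names and rules here are hypothetical illustrations, not Traydstream's actual compliance engine.

```python
def screen_transaction(tx, sanctioned_countries):
    """Hypothetical compliance-style checks on digitised trade data:
    blank fields, beneficiary/account name mismatch, and country
    restrictions. Returns a list of flagged issues."""
    issues = []
    for field in ("beneficiary", "amount", "country"):
        if not tx.get(field):
            issues.append(f"blank field: {field}")
    if tx.get("beneficiary") and tx.get("account_name") and \
            tx["beneficiary"].lower() != tx["account_name"].lower():
        issues.append("name mismatch")
    if tx.get("country") in sanctioned_countries:
        issues.append("sanctioned country")
    return issues

tx = {"beneficiary": "Acme Ltd", "account_name": "ACME LTD",
      "amount": 1000, "country": "XX"}
print(screen_transaction(tx, {"XX"}))  # ['sanctioned country']
```

A production engine would replace the hard-coded rules with learned models and maintained sanctions lists, but the pipeline shape — extract fields via OCR, then run each through a battery of checks — is the same.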


How blockchain is changing the way we invest

What blockchain has essentially done is diversify the means people can invest. People can now invest in cryptocurrencies, startups, and tokenized real-world assets all because of blockchain. The next challenge then is to encourage adoption of these investment vehicles. Cryptocurrencies are getting the most attention as the value of these coins continue to move up and more countries accept these as legal tenders. ICOs are also generating much interest as startups come up with interesting ideas on using blockchain to drive business. These new blockchain trading platforms, while still unproven, offer much promise. “My dream is to make NASDAQ on Blockchain with a wider range of tradable assets and a dramatic reduction of listing costs, settlement time, and transaction costs,” said LAToken CEO Valentin Preobrazhenskiy.


How Machine Learning Is Helping Morgan Stanley Better Understand Client Needs

So Morgan Stanley’s wealth management business unit has been working for several years on a “next best action” system that FAs could use to make their advice both more efficient and more effective. The first version of the system, which used rule-based approaches to suggesting investment options, is being replaced by a system that employs machine learning to match investment possibilities to client preferences. There are far too many investing options today for FAs to keep track of them all and present them to clients. And if something momentous happens in the marketplace — for example, the Brexit vote and the resulting decline in UK-based stocks — it’s impossible for FAs to reach out personally to all their clients in a short timeframe.


Rise of the chatbot: security concerns

As chatbots become increasingly intelligent, they are being equipped with additional capabilities, including the ability to process financial transactions. If you are comfortable using Facebook Messenger or WhatsApp to chat with your friends, then using it to make payments or check your account balance with a simple message, rather than opening a separate internet banking app, makes sense. This type of transaction through chatbots is already being seen in the US, where Uber is integrated into Facebook Messenger, which added a payments solution for businesses in late 2016, meaning users can order and pay for their Uber through a simple message.


The core ethics of data analytics

It was not possible to be prescriptive about how data should be used as that risked limiting both the benefits to the consumer or citizen, and the benefits to the corporation or Government using the data, Mr Sherman said. Regulation would also struggle to keep pace with technology developments, he said, and the looming impact of big data, artificial intelligence (AI) and machine learning on privacy was noted by a series of speakers at the conference. Simon Entwistle, deputy commissioner of the UK Information Commissioner’s Office, noted that as machine learning and AI took hold it would be increasingly difficult to know how people’s data was being used, or whether deidentified data was being transformed into personal data. “Taking an ethical approach is more important than ever – to go beyond compliance to the underlying legal obligations,” he said.


McKinsey argues how the current wave of AI is ‘poised to finally break through’

Robotics and speech recognition are two of the most popular investment areas. Investors most favor machine learning startups because of the speed with which code-based startups can scale up and add new features. Software-based machine learning startups are preferred over their more cost-intensive, hardware-based robotics counterparts, which lack the scaling advantages of their software peers. As a result of these factors and more, corporate M&A is soaring in this area, with the Compound Annual Growth Rate (CAGR) reaching approximately 80% from 2013 to 2016. The following graphic illustrates the distribution of external investments by category from the study. These industries are known for their willingness to invest in new technologies to gain competitive and internal process efficiencies.
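For scale, an 80% CAGR compounds quickly. A quick check of what that rate implies across the 2013–2016 period (three compounding years):

```python
# Compound annual growth: value multiplies by (1 + rate) each year.
rate = 0.80    # ~80% CAGR cited for corporate M&A in this area
years = 3      # 2013 -> 2016
multiple = (1 + rate) ** years
print(round(multiple, 1))  # activity grew roughly 5.8x over the period
```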


The Art of Crafting Architectural Diagrams

As Philippe Kruchten said, "architecture is a complex beast. Using a single blueprint to represent architecture results in an unintelligible semantic mess." To document modern systems, we cannot end up with only one sort of diagram, but when creating architectural diagrams it is not always obvious which diagrams to choose and how many of them to create. There are multiple factors to take into consideration before making a decision: for example, the nature and complexity of the architecture, the skills and experience of the software architect, the time available, the amount of work needed to maintain the diagrams, and what makes sense or is useful for meeting stakeholders' concerns. For example, a network engineer will probably want to see an explicit network model including hosts, communication ports and protocols;


Blockchain tokens may be the future of finance — if regulators allow it

Token offerings signal a sea change in the way that blockchain startups and software protocols are being financed. Beyond upsetting the balance of the investor ecosystem, though, this new funding model, and the projects taking advantage of it, may herald the dawn of something even more radical: a decentralized internet powered by applications that blur the line between owners and users. Inevitably, some observers are calling it an overheated market. Some are even comparing it to the dot-com bubble of the early 2000s. Most distressing to traditional investors, and even to early adopters of bitcoin, is the fact that some token projects have raised millions of dollars on the basis of little more than a white paper and a website. The Securities and Exchange Commission recently raised the barrier to entry for American entrepreneurs and investors, and risks putting the kibosh on such innovation altogether.



Quote for the day:


"Learning to ignore things is one of the great paths to inner peace." -- Robert J. Sawyer


Daily Tech Digest - August 04, 2017

Edge computing could push the cloud to the fringe

For some Internet of Things devices, like connected video cameras, it also ceases to be practical to send the data to the cloud simply because of the pure volume involved. As an example, he points out that there are already half a billion connected cameras in place today, with a billion expected to be deployed worldwide by 2020. As he says, once you get over 1080p quality, it ceases to make sense to send the video to the cloud for processing, at least initially, especially if you are using the cameras in a sensitive security zone such as an airport, where you need to make decisions fast if there is an issue. Then there’s latency. Talla echoes Levine’s thinking here, saying machines like self-driving cars and industrial robots need decisions in fractions of a second, and there just isn’t time to send the data to the cloud and back.
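To see why volume alone becomes prohibitive, a back-of-the-envelope estimate helps; the per-stream bitrate below is a typical figure assumed for illustration, not one from the article:

```python
# Assume ~5 Mbit/s per 1080p H.264 camera stream (an assumption for scale).
mbps_per_camera = 5
cameras = 1_000_000_000                 # ~1 billion cameras projected by 2020
total_tbps = mbps_per_camera * cameras / 1_000_000   # Mbit/s -> Tbit/s
print(total_tbps)  # 5000.0 Tbit/s of aggregate upstream video
```

Thousands of terabits per second of continuous upstream traffic is the arithmetic behind the argument for filtering and processing frames at the edge rather than shipping raw video to the cloud.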


Much Ado About Blockchain

The fact that government authorities have become so active in blockchain experimentation means that we will soon reach a point where corporations will be expected to interact with those authorities using the distributed ledger technology. Whether through basic blockchain-based registration processes or more elaborate data-transfer protocols, corporations need to start thinking about whether or not their existing technologies and ERP systems can support this type of dialogue. From a financial perspective, the C-suite also needs to think about the competition. Ultimately, the goal of blockchain is to reduce administrative costs. Companies that get out ahead, finding ways to practically leverage the blockchain to make their operations more efficient, will have an edge on their competition when it comes to annual operating costs.


Measuring Data Science Business Value

Just as any other department uses operational metrics to measure the efficiency and effectiveness of their organizations, data science needs to do the same. Sales teams use funnel metrics to measure how effectively their teams are converting prospects down the funnel to closed-won. Engineering teams use sprint burndown and team velocity to measure how efficiently their teams complete work in a given amount of time. Monitoring leading indicators allow these teams to quickly adjust before a revenue number is missed or a product feature is delayed. For data science teams, it is important to identify critical information like when a data scientist is spending precious hours reproducing something someone else has already done. Or when a business stakeholder has not been looped in to give feedback on a project before it is delivered as “done”.


It's time for the financial sector to embrace an AI-led future

Large commercial banks have several back-office processes that AI operating models are transforming. Operations such as reconciliation, consolidation and credit risk management can be handled with robotic process automation (RPA) and, combined with machine learning, fully automated. Banking functions central to operations, such as the quarterly closing and reporting of earnings, can be accomplished in real time with AI, allowing for greater accuracy and quicker adjustments. A case in point is the machine learning program called COIN used by JPMorgan Chase & Co, one of the biggest banks in the United States. The AI program takes just a few seconds to review commercial loan agreements, compared to the 360,000 hours each year that a team of lawyers and loan officers would take to complete the same work.


Internet of things and blockchains: help or hindrance?

The technology is still immature and likely to transition quickly, and fierce competition from rival distributed ledger platforms is a threat. Moreover, the computing power needed to achieve consensus of distributed transactions is expensive, and more cost-efficient models need to be found. With the likes of BP and Microsoft opting for ethereum, there also needs to be more standardised regulation and stability of the cryptocurrencies so larger enterprises can comfortably roll out the technology for customer use. Blockchains may be able to provide an additional layer of security and operational efficiency to existing IoT models, but the technology is not necessarily needed in every use case. In fact, there are IoT platforms already in place that implement supply chain traceability, monitor product provenance, and ensure product authenticity.


Mark Zuckerberg and Elon Musk's debate over artificial intelligence: will robots go rogue?

“If we’re lucky, they’ll treat us as pets,” says Paul Saffo, a consulting professor at Stanford University, “and if we’re very unlucky, they’ll treat us as food.” But there are academics, including Noam Chomsky, who don’t believe computers will ever be able to attain that level of intelligence. Yes, they might be able to learn to speak Chinese, but will they ever truly understand Chinese, or merely simulate understanding? These questions are all bound up with concepts of intelligence, sentience, self-awareness and consciousness, things that remain stubbornly impervious to scientific analysis. Zuckerberg’s angle on AI, shared by other industry figures such as Google’s Ray Kurzweil, is that super-smart micro-intelligences offer great benefit to mankind and will always remain under our ultimate control.


From barter to blockchain: A history of money

The world is connected through the internet now; it is at everyone’s fingertips. Computers make us more efficient at what we do than we ever were. What used to take us days takes minutes now. We couldn’t make more time, so instead we make more out of the time we have. Computing power is such an integral part of our life that it has become as scarce and valuable as gold was to our ancestors. And when smart people asked the profound question: “Can we make a currency that is backed by computing power?,” the answer was cryptocurrencies and blockchains. If you don’t yet understand how blockchains work, I’ve written the ultimate guide in plain English to help you understand the concept.


How to get started with machine learning

Not surprisingly, software engineering teams are generally not well-equipped to handle these complexities and so can fail pretty seriously. “A good, solid, and comprehensive platform that lets you scale effortlessly is a critical component” to overcoming some of this complexity, Dunning says. “You need to focus maniacally on establishing value for your customers and you can't do that if you don't get a platform that has all the capabilities you need and that will allow you to focus on the data in your life and how that is going to lead to customer-perceived value.” Enter TensorFlow. Open source, a common currency for developers, has taken on a more important role in big data. Even so, Dunning asserts that “open source projects have never really been on the leading edge of production machine learning until quite recently.”


Social Engineering: The Basics

Social engineering has proven to be a very successful way for a criminal to "get inside" your organization. Once a social engineer has a trusted employee's password, he can simply log in and snoop around for sensitive data. With an access card or code that gets him physically inside a facility, the criminal can access data, steal assets or even harm people. Chris Nickerson, founder of Lares, a Colorado-based security consultancy, conducts 'red team testing' for clients, using social engineering techniques to see where a company is vulnerable. Nickerson detailed for CSO how easy it is to get inside a building without question. In one penetration test, Nickerson used current events, public information available on social network sites, and a $4 Cisco shirt he purchased at a thrift store to prepare for his illegal entry.


Excel 2016 cheat sheet

If you're working in a workbook you've saved in OneDrive or SharePoint, you'll see a new button on the Ribbon, just to the right of the Share button. It's the Activity button, and it's particularly handy for shared workbooks. Click it and you'll see the history of what's been done to the spreadsheet, notably who has saved it and when. To see a previous version, click the "Open version" link underneath when someone has saved it, and the older version will appear. And there's a very useful difference in what Microsoft calls the backstage area that appears when you click File on the Ribbon: If you click Open, Save or Save As from the menu on the left, you can see the cloud-based services you've connected to your Office account, such as SharePoint and OneDrive. Each location now displays its associated email address underneath it.


Here Are 306 Million Passwords You Should Never Use

The 306 million passwords encompass more than 1 billion compromised accounts. The data comes from rich sources, including the Exploit.in and Anti-Public lists. Both of those lists were massive mash-ups of stolen data covering just over 1 billion email addresses. Hunt calls the service Pwned Passwords. Service providers can use the data in their back-end systems with the aim of improving the state of password security. Hunt has made a 6GB file available with the data. For example, if someone is registering a new account, a service provider can compare the chosen password against the data and warn the individual that the password has been compromised before. At that point, the person can be strongly encouraged or forced to choose a more secure password.
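On the back end, such a check reduces to hashing the candidate password and testing membership in the breached set. A minimal sketch, assuming the downloaded corpus has been parsed into one uppercase SHA-1 hex digest per line (the exact file layout here is an assumption, not Hunt's published format):

```python
import hashlib

def load_pwned_hashes(path):
    """Load a file assumed to contain one uppercase SHA-1 hex
    digest per line into a set for O(1) membership lookups."""
    with open(path) as f:
        return {line.strip() for line in f}

def is_pwned(password, pwned_hashes):
    """True if the password's SHA-1 digest appears in the breach corpus."""
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    return digest in pwned_hashes

# At signup: warn on, or reject, previously compromised passwords.
pwned = {"5BAA61E4C9B93F3F0682250B6CF8331B7EE68FD8"}  # SHA-1 of "password"
print(is_pwned("password", pwned))   # True
```

Because only hashes are compared, the service never needs to store or transmit the candidate password in plaintext.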



Quote for the day:


"Strategy is not really a solo sport, even if you're the CEO." -- Max McKeown


Daily Tech Digest - August 03, 2017

Rebooting Cybersecurity

Certainly, compliance frameworks and programs help to establish the minimum standards for security and give a company a checkmark during audits, but frameworks and programs often fail to protect a company from breaches. Having frameworks and programs will not be sufficient if they do not reflect real-world dynamics and fail to provide needed monitoring, detection, responses, or protection. ... Only a fifth of respondents would invest in mitigating financial loss, and just 22 per cent would invest in cybersecurity training. ... Protecting a company requires an end-to-end approach that considers threats across the spectrum of the industry-specific value chain and the company’s ecosystem. Business exposure needs to be identified and minimized, with a focus on protecting priority assets.


Why SSL/TLS attacks are on the rise

As enterprises get better about encrypting network traffic to protect data from potential attacks or exposure, online attackers are also stepping up their Secure Sockets Layer/Transport Layer Security (SSL/TLS) game to hide their malicious activities. In the first half of 2017, an average of 60 percent of transactions observed by security company Zscaler have been over SSL/TLS, the company’s researchers said. The growth in SSL/TLS usage includes both legitimate and malicious activities, as criminals rely on valid SSL certificates to distribute their content. Researchers saw an average of 300 hits per day for web exploits that included SSL as part of the infection chain. “Crimeware families are increasingly using SSL/TLS,” said Deepen Desai, senior director of security research at Zscaler.


Five New Threats To Your Mobile Device Security

Today, mobile devices are coming under increasing attack – and no one is immune. Some 20 percent of companies surveyed by Dimensional Research for Check Point Software said their mobile devices have been breached. A quarter of respondents didn’t even know whether they’ve experienced an attack. Nearly all (94 percent) expected the frequency of mobile attacks to increase, and 79 percent acknowledged that it’s becoming more difficult to secure mobile devices. “They’re starting now to become more aware of the possible impact,” says Daniel Padon, mobile threat researcher at Check Point. “Real, state-level malware and the capability of such malware, together with large campaigns affecting millions and millions of devices, such as Gooligan and Hummingbad, are just the tip of the iceberg.”


Nvidia and Remedy use neural networks for eerily good facial animation

As showcased at Siggraph, by using a deep learning neural network—run on Nvidia's costly eight-GPU DGX-1 server, naturally—Remedy was able to feed in videos of actors performing lines, from which the network generated surprisingly sophisticated 3D facial animation. This, according to Remedy and Nvidia, removes the hours of "labour-intensive data conversion and touch-ups" that are typically associated with traditional motion-capture animation. Aside from cost, facial animation, even when motion captured, rarely reaches the same level of fidelity as other animation. That odd, lifeless look seen in even the biggest of blockbuster games often came down to the limits of facial animation. Nvidia and Remedy believe its neural network solution is capable of producing results as good, if not better, than what's produced by traditional techniques.


What’s new in Angular 5: easier progressive web apps

Its features include: an emphasis on making it easier to build progressive web apps, so apps can be cached in the browser; a build optimizer that makes the application smaller by eliminating unnecessary code; and Material Design components that are compatible with server-side rendering. The progressive web apps concept, the product of a joint effort between Google and Mozilla, is about enabling development of browser-based apps that offer a superior, native-like experience. Supporting progressive web apps in Angular today requires a lot of expertise on the developers’ part; version 5 is intended to make usage easier. “We’re shooting to try and make progressive web apps something that everyone would use,” said Brad Green, a Google engineering director.


Digital Crime-Fighting: The Evolving Role of Law Enforcement

As the cybercrime landscape continues to evolve, methods of policing it must change as well. The increasing number of cyber attacks propagated by everyone from nation-state actors to average criminals is blurring lines between cybersecurity and public safety, ultimately causing a shift in the role of government and law enforcement in protecting against these threats. Verizon's 2017 Data Breach Investigations Report notes, "In addition to catching criminals in the act, security vendors, law enforcement agencies and organizations of all sizes are increasingly sharing threat intelligence information to help detect ransomware (and other malicious activities) before they reach systems." Using their own behind-the-scenes collaboration venues, threat actors have also become increasingly well armed and well informed.


Cyber security is now mainstream business

Unfortunately, those caught in the middle of the storm understand it more profoundly than the observers do. While the recent ransomware had an unprecedented large-scale impact, it was minuscule compared to the computer infrastructure of the world. This means that the majority of individuals and organisations will remain unaware of the steps they should take to build an optimal cyber defence against cyber threats. That is the biggest bane of the cyber security industry and profession. The second observation is that the organisations that were impacted are building and strengthening controls around the risks of the recent ransomware attacks. That is important, but when you build cyber defence, you should consider all the possible risks to your business and build a security programme that works on mitigating those risks comprehensively.


7 unexpected ways collaboration software can boost productivity

Collaboration software can enable organizations to plan for crises and emergencies, said Michelle Vincent, collaboration and training officer for information services at Mercy Ships. The organization uses private hospital ships to provide free surgeries to residents of developing nations. Mercy Ships uses HipChat “to connect our various stakeholders during crisis drills,” Vincent said. “Hopefully, we'll never have a fire or other such emergency on the ship. But we rely on drills to keep us ready, and HipChat is a useful tool to keep us connected in real time for that purpose.” Emergency drills on the ship are performed almost every week, Vincent said, while a crisis management team’s drills involving multiple locations usually take place every quarter.


Amazon Echo hacked to allow continuous remote eavesdropping

The fact that physical access is required makes it unlikely this will happen to your Echo. The attack also works only on 2015 and 2016 editions of Amazon Echo devices, as they had a rubber base that can be popped off to reveal 18 debug pads. Neither the 2017 Echo model nor the Amazon Dot is vulnerable. If a knowledgeable attacker did have access to an older Echo, Barnes noted that rooting it is “trivial.” After rooting the Echo, the researchers wrote a script to continuously grab the raw microphone audio data. Barnes called the physical access requirement a “major limitation.” Still, the how-to is out there now, so perhaps that should give you pause before you purchase a second-hand Echo.


Cloud Data Auditing Techniques with a Focus on Privacy and Security

Nowadays, with the help of cryptography, verification of remote (cloud) data is performed by third-party auditors (TPAs).[2] TPAs are also appropriate for public auditing, offering auditing services with more powerful computational and communication abilities than regular users.[3] In public auditing, a TPA is designated to check the correctness of cloud data without retrieving the entire dataset from the CSP. However, most auditing schemes don’t protect user data from TPAs; hence, the integrity and privacy of user data are lost.[1] Our research focuses on cryptographic algorithms for cloud data auditing and the integrity and privacy issues that these algorithms face. Many approaches have been proposed in the literature to protect integrity and privacy; they’re generally classified according to data’s various states: static, dynamic, multiowner, multiuser, and so on.
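The flavor of a remote-data audit can be shown with a deliberately simplified nonce-based spot check. Real public-auditing schemes use homomorphic authenticators precisely so the verifier needs neither the data nor a per-challenge copy; this toy version does not achieve that, but it illustrates why a fresh nonce forces the CSP to still hold the actual block:

```python
import hashlib
import os

def block_digest(block: bytes, nonce: bytes) -> bytes:
    """Keyed digest of one data block; computable only with the block itself."""
    return hashlib.sha256(nonce + block).digest()

# --- Setup: the owner stores data blocks at the cloud service provider (CSP).
data_blocks = [b"block-0 contents", b"block-1 contents", b"block-2 contents"]

# --- Audit: the verifier challenges one random block with a fresh nonce,
# so the CSP cannot answer from a precomputed hash.
challenge_index = 1
nonce = os.urandom(16)

# CSP side: must read the real stored block to answer the challenge.
proof = block_digest(data_blocks[challenge_index], nonce)

# Verifier side: checks the one challenged block, not the whole dataset.
expected = block_digest(b"block-1 contents", nonce)
print(proof == expected)  # True: the CSP still stores the block intact
```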



Quote for the day:


"You're not always going to be successful, but if you're afraid to fail, you don't deserve to be successful." -- Charles Barkley


Daily Tech Digest - August 02, 2017

Procurement: looking ahead to a digital future

Technology is lifting the role of procurement, providing services that are critical to maintaining corporate reputation, not to mention legality. With procurement functions frequently stretching across the globe, companies need full visibility all the way to the end of a supply chain to guarantee it is free of issues such as modern-day slavery or bribery and corruption. “Automated procurement systems can help map connections in a supply chain; for a CEO, who is legally responsible for the elimination of poor processes, that puts procurement in a different light,” says Mr Coulcher. Using technology, such as the blockchain, companies can validate supply chains, bringing a level of transparency that has not always been possible. Last autumn, BHP Billiton, the mining company, announced plans to use ethereum blockchain to improve its supply chain processes.


What Is IT Governance? A Formal Way To Align IT & Business Strategy

Organizations today are subject to many regulations governing the protection of confidential information, financial accountability, data retention and disaster recovery, among others. They're also under pressure from shareholders, stakeholders and customers. To ensure they meet internal and external requirements, many organizations implement a formal IT governance program that provides a framework of best practices and controls. Both public- and private-sector organizations need a way to ensure that their IT functions support business strategies and objectives. And a formal IT governance program should be on the radar of any organization in any industry that needs to comply with regulations related to financial and technological accountability.


Design Thinking 101

Each phase is meant to be iterative and cyclical as opposed to a strictly linear process, as depicted below. It is common to return to the two understanding phases, empathize and define, after an initial prototype is built and tested. This is because it is not until wireframes are prototyped and your ideas come to life that you are able to get a true representation of your design. For the first time, you can accurately assess if your solution really works. At this point, looping back to your user research is immensely helpful. What else do you need to know about the user in order to make decisions or to prioritize development order? What new use cases have arisen from the prototype that you didn’t previously research?


Why ex-employees may be your company's biggest cyberthreat

Ex-employees are increasingly a cybersecurity risk, Maxim noted: In June, Dutch web host Verelox experienced a major outage of all of its services after most of its servers were deleted by an ex-employee, according to the company. And in April, the US-based Allegro Microsystems sued an ex-IT administrator for allegedly installing malware that deleted critical financial data. So why don't companies take away this access immediately? For one, the process can be time consuming: 70% of IT decision makers surveyed said it can take up to an hour to deprovision all of a single former employee's corporate application accounts. For another, IT and HR do not often work together, said Al Sargent, senior director at OneLogin. "This is a problem, because HR has the single source of truth for which employees are at the company and which are not, whereas IT controls access to the applications," he added.


State of Cybercrime 2017: Security events decline, but not the impact

Companies are spending more on IT security, with an average budget increase of 7.5 percent. Ten percent of respondents reported an increase of more than 20 percent. The bulk of that money is being spent on new technologies (40 percent), but companies are paying for knowledge, too, in the form of audits and assessments (34 percent), adding new skills (33 percent), and knowledge sharing (15 percent). Respondents said they were investing in redesigning their cybersecurity strategy (25 percent) and processes (17 percent) as well. Speaking of cybersecurity strategy, an amazing 35 percent of respondents said that a cyber response plan was not part of it. The good news is that 19 percent planned to implement a plan within the next year.


Kubernetes as a service offers orchestration benefits for containers

Canonical makes Ubuntu, a leading free and open source Linux server and desktop OS distribution. The Canonical Kubernetes-as-a-service distribution is a packaged deployment that stitches together additional Canonical open source projects surrounding Kubernetes, such as Juju, an application modeling framework that uses Charm scripts to simplify Kubernetes infrastructure builds, and Conjure-up, which orchestrates these Juju script deployments. This distribution runs on various infrastructure environments, including local workstations, bare-metal servers, AWS, Google Compute Engine, Azure, Joyent and OpenStack. Canonical partnered with Google, Kubernetes' original developer, to maintain its distribution, with the aim of simplifying and standardizing Kubernetes clusters on just about any conceivable environment.


Black Hat 2017: Insightful, but too much hype

The industry has become far too obsessed with the zero-day problem (i.e. zero-day exploits) and isn't paying enough attention to eliminating all the manual tasks and busy work we do as cybersecurity professionals. Oh, I agree that zero-days are a problem, but these attacks are the exception. We need to get better at bread-and-butter cybersecurity operations with improved processes, automation and orchestration. In other words, people REMAIN the weakest link of the cybersecurity chain. Addressing this problem should be a high priority for all CISOs. ... New security analytics tools are expanding and challenging SIEM platforms. Software-defined tools are pushing out tried-and-true network security controls.


Machine Learning Comes To Your Browser Via Javascript

The most prominent advantages of TensorFire’s approach are its portability and convenience. Modern web browsers run on most every operating system and hardware platform, and even low-end smartphones have generous amounts of GPU power to spare. Much of the work involved in getting useful results from machine learning models is setting up the machine learning pipeline, either to perform the training or to deliver the predictions. It is very useful to boil much of that process down to just opening up a web browser and clicking something, at least for certain classes of jobs. Another advantage claimed by TensorFire’s creators is that it allows the deployment of predictions to be done entirely on the client. This won’t be as much of an advantage where both the trained model and the data are already deployed to the cloud.


Bill Would Beef Up Security for IoT Wares Sold to US Gov't

"This bill deftly uses the power of the federal procurement market, rather than direct regulation, to encourage Internet-aware device makers to employ some basic security measures in their products," Jonathan Zittrain, co-founder of Harvard University's Berkman Klein Center for Internet and Society, said in a statement announcing the bill's introduction. Security technologist and author Bruce Schneier, also in a statement, sees the legislation as a way to motivate vendors to make the investments needed to secure their IoT offerings. "The market is not going to provide security on its own, because there is no incentive for buyers or sellers to act in anything but their self-interests," says Schneier, who also is a Harvard Kennedy School of Government fellow and lecturer.


How to use Redis for real-time stream processing

Redis has become a popular choice for such fast data ingest scenarios. A lightweight in-memory database platform, Redis achieves throughput in the millions of operations per second with sub-millisecond latencies, while drawing on minimal resources. It also offers simple implementations, enabled by its multiple data structures and functions. In this article, I will show how Redis Enterprise can solve common challenges associated with the ingestion and processing of large volumes of high velocity data. We’ll walk through three different approaches (including code) to processing a Twitter feed in real time, using Redis Pub/Sub, Redis Lists, and Redis Sorted Sets, respectively. As we’ll see, all three methods have a role to play in fast data ingestion, depending on the use case.
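Of the three, the Sorted Sets approach keys each incoming message by a numeric score such as its arrival timestamp, which both deduplicates and orders the stream. The stand-in below mimics the two Redis commands involved (ZADD and ZRANGEBYSCORE) in plain Python so the access pattern can be run without a server; with redis-py the equivalent calls would be r.zadd() and r.zrangebyscore():

```python
class SortedSetStandIn:
    """Pure-Python stand-in for a Redis sorted set, illustrating the
    ingest pattern: members are deduplicated and ordered by score."""

    def __init__(self):
        self._scores = {}            # member -> score (last write wins)

    def zadd(self, member, score):
        """Like ZADD: insert, or update the score of an existing member."""
        self._scores[member] = score

    def zrangebyscore(self, lo, hi):
        """Like ZRANGEBYSCORE: members with lo <= score <= hi, in score order."""
        members = [m for m, s in self._scores.items() if lo <= s <= hi]
        return sorted(members, key=self._scores.get)

stream = SortedSetStandIn()
stream.zadd("tweet:42", 1501560000.0)   # member keyed by arrival timestamp
stream.zadd("tweet:43", 1501560001.5)
stream.zadd("tweet:42", 1501560002.0)   # duplicate ID: score updated, not re-added

# Consumers read a time window in order, e.g. everything in the last batch.
print(stream.zrangebyscore(1501560000.0, 1501560003.0))
```

The same dedup-plus-ordering behavior is what makes the real Redis sorted set a good landing zone for a high-velocity feed, while Pub/Sub suits fan-out and Lists suit simple work queues.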



Quote for the day:


"The most rewarding things you do in life are often the ones that look like they cannot be done." -- Arnold Palmer


Daily Tech Digest - August 01, 2017

Understanding the critical role of 'first receiver' in IoT data

The role of the first receiver is first and foremost to allow for the effective leveraging of the utility value of the IoT data. At the center of this is the role of governance and data primacy. The first receiver assumes, but does not necessarily dictate, that the owner of the IoT data will be the enterprise that owns (or minimally controls) the IoT subsystems. Noting that a single record generated from a single sensor message can be utilized by a variety of constituents, internal or external, in a variety of different ways, the aim is to securely and cost-effectively provide a mechanism for allowing the right people/organizations to access and use the right data in the right place at the right time. While the first receiver should also have several other functions associated with it, the key is enabling the right architecture for leveraging the underlying data.


The most valuable cloud computing certifications today

As the cloud becomes more critical to IT and the business at large, demand for cloud skills will only grow. But proving you have the right skills and knowledge in a competitive job market can be difficult. If you’re looking for an extra edge in landing a new job or promotion, cloud certification can be a great option. Certifications measure knowledge and skills against industry benchmarks to help you prove to employers that you have the right mix of cloud skills, knowledge, and expertise. If you’re looking for more general, across-the-board knowledge, a vendor-neutral certification can provide a broad overview of key concepts and foundational expertise. If you’re looking to specialize, whether in your current job or because you’re angling to land a new role, consider specializing in one or more vendor-specific certifications, such as AWS or VMware.


Public Company to Convert Bitcoin to Stock in First-of-Its-Kind Fundraise

Since first posting the terms of its convertible bitcoin loan online, DigitalX CEO Leigh Travers told CoinDesk he's been contacted by several "investor relations departments of major blockchain companies" with questions about how they can do the same. It turns out, it's not actually that difficult. But, to understand how the bitcoin loan was executed, it's important to grasp a bit about DigitalX's unusual background. Back in March 2014, DigitalX became Australia’s first listed bitcoin company following the reverse takeover of "dying oil and gas firm" Macro Energy. The cash-rich company acquired DigitalX (then called Digital CC), but gave more shares to the acquiring company than the original shareholders had, resulting in a shift of control.


The Next Big Technology Could Be Nanomaterials

Some researchers think nanomaterials could be useful in mitigating the greenhouse effect and thus slowing or stalling climate change. Scientists at the University of Texas at Dallas have developed a honeycomb-shaped structure built from nanoscale molecules that can trap carbon emissions from the atmosphere. To keep gases from leaking out of the device, the researchers capped its outer surface with vapors of a one-nanometer molecule called ethylenediamine. This protective layer was an attempt to mimic the way bees seal parts of their hives with wax. Another application, developed at Rice University, uses sheets of graphene to recycle waste carbon dioxide into ethylene and ethanol. In a lab test, graphene showed the potential to reduce carbon dioxide by up to 90 percent, converting 45 percent of greenhouse gases into clean fuel. Potential healthcare uses include silicon nanowires that can be absorbed by human cells.


Inside Wells Fargo's cybersecurity war room

In the past, banks have primarily run paper or desktop cyberattack simulations that focused on who would call whom in the event of a cybercatastrophe. “It’s now emerged to a cyber range, a cybersimulation that allows those cyberwarriors to respond to real-life infections and malware, strengthen their skills, improve the controls in their environment and get ready for what may come one day from a malicious or nation-state actor,” Baich said. “In a cyber range, you take real action and, since it's a virtual environment, it will not impact production systems.” Most large banks are using cyber ranges. But the technique has yet to reach the midtier and smaller banks. Chris Thompson, senior managing director and head of financial services cybersecurity and resilience at Accenture Security, said many organizations can’t afford it.


Developing The Next Generation Of Transport

First, the vehicles are becoming more complicated with many more interconnected systems that all have to be operated precisely to deliver the performance and response demanded by customers. To deal with this, the detail of the models being used on the HiL test rigs needs to be increased so that they cover all the vehicle systems including the engine, electric motors, cooling, suspension and so on. These complex multi-domain models need to run in real time with the appropriate level of detail to feed the control systems. The second challenge is that to make a vehicle autonomous, it must be able to see and interpret its surroundings and this is typically done through a range of sensors including cameras, radars, LiDARs (light detection and ranging) and ultrasound sensors.
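The real-time requirement on HiL models can be made concrete with a fixed-timestep loop: a plant model is stepped at a constant rate while a controller stub (standing in for the ECU under test) reads the simulated sensor each tick. The thermal model, gains, and setpoint below are illustrative assumptions, not drawn from any real test rig:

```python
# Sketch of a hardware-in-the-loop-style fixed-timestep loop.
# All parameters here are made up for illustration.

def plant_step(temp, heater_on, dt):
    """First-order thermal plant: heat input when on, ambient cooling always."""
    ambient = 20.0
    heating = 5.0 if heater_on else 0.0
    return temp + dt * (heating - 0.1 * (temp - ambient))

def controller(temp, setpoint=60.0):
    """Bang-bang controller, playing the role of the ECU under test."""
    return temp < setpoint

temp, dt = 20.0, 0.1
for _ in range(2000):             # 200 s of simulated time
    heater_on = controller(temp)  # controller reads the simulated sensor
    temp = plant_step(temp, heater_on, dt)
```

On a real rig the same loop would be driven by a hard real-time clock, and the single thermal equation would be replaced by the multi-domain engine, motor, cooling, and suspension models the excerpt describes; the structure of sense-decide-step stays the same.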


The Power Of Digital Transformation In A Data-Driven World

In a data-driven world, an organization can rethink many of its old assumptions. When Airbnb, for example, broke away from processes and focused on data, it realized the company doesn’t need to own physical assets (hotels). Aspects of a hotel business that made it competitive in a process-driven world get stood on their head in a data-driven world. People who have apartments in great locations are a different option than hotels and provide different value in the customer experience. In rethinking old assumptions about a business, we can get to where value or opportunities emerge in different places than in a process-defined world. Another example of rethinking assumptions in a data-driven world can be found in human resources processes.


Data is eating the world: How data is reshaping business in the networked economy

The first thing to understand as we transition to a world dominated by data and networks (networks being any collection of people, nodes or data that is linked together) is this will not be a bigger version of mobile. It is fundamentally different, and requires a new sensitivity and instinct for how connected systems work. The recent cyber/data attack on the NHS in the UK (and others) points to a fracture in the old ideas and institutions we have come to rely on. Data has now become the great fault line of business, and almost everything we know and have learnt about digital, technology, and in many cases commerce, must be left behind us. We have heard a lot about digital transformation over the past 10 years, as companies grapple to maintain relevance in a mobile-dominated world.


Culture for a digital age

Executives must be proactive in shaping and measuring culture, approaching it with the same rigor and discipline with which they tackle operational transformations. This includes changing structural and tactical elements in an organization that run counter to the culture change they are trying to achieve. The critical cultural intervention points identified by respondents to our 2016 digital survey—risk aversion, customer focus, and silos—are a valuable road map for leaders seeking to persevere in reshaping their organization’s culture. The remainder of this article discusses each of these challenges in turn, spelling out a focused set of reinforcing practices to jump-start change.


Adopting AI in the Enterprise: Ford Motor Company

The most common AI applications involve direct driver interaction, including advisory systems that monitor acceleration and braking patterns to provide on-board evaluations of a driver’s preferences and intentions for different purposes—characterization of the driver, advice for fuel-efficient driving and safe driving, auto-selecting the optimal suspension and steering modes, simplifying the human-machine interface by estimating the most likely next destination, and preferred settings of the climate control, etc. These systems use traditional AI methods—rule-based, Markov models, clustering; they do not require special hardware. One distinctive feature is that these systems are intelligent enough to gauge how well drivers accept their recommendations, so they can avoid becoming an annoyance.
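One of the "traditional AI" methods the excerpt names, a Markov model, can be sketched for the next-destination use case it mentions. The place names and trip history below are invented for illustration; Ford's actual models are not described in the excerpt:

```python
from collections import Counter, defaultdict

# First-order Markov model over trip destinations (illustrative data).
history = ["home", "work", "gym", "home", "work",
           "home", "work", "gym", "home"]

# Count observed transitions: transitions[src][dst] = times src -> dst.
transitions = defaultdict(Counter)
for src, dst in zip(history, history[1:]):
    transitions[src][dst] += 1

def most_likely_next(current):
    """Predict the most frequently observed next destination."""
    counts = transitions[current]
    return counts.most_common(1)[0][0] if counts else None

print(most_likely_next("work"))  # prints "gym"
```

No special hardware is needed, matching the excerpt's point: counting transitions and looking up the most frequent successor is cheap enough to run on an ordinary in-vehicle processor.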



Quote for the day:


"What is important for a leader is that which makes him a leader. It is the needs of his people." -- Frank Herbert