Daily Tech Digest - June 27, 2017

8 Ways Millennials Impact Your Security Awareness Program

Millennials are the first generation for whom computing devices are ubiquitous in daily life. Consider that laptops have become the computer of choice and can be taken anywhere. Cellphones are more powerful and functional than computers were a decade ago, and millennials have had these devices in their pockets for as long as most of them can remember. But using a technology does not mean using it safely, and millennials' comfort with technology does not mean that they are more security aware. The tendency is to use technology in the way that is most convenient, not most secure. And while there has been some effort to protect their privacy, primarily from their parents and others, this does not mean that they are aware of all the things there are to protect and how to protect them. The fact is, the more information that is made available, the more vulnerable it becomes.

Tools that increase IT efficiency pave the way for digital transformations

Content is core to the work of Densho, an organization whose mission is to preserve the testimonies of Japanese Americans who were held in internment camps during World War II. In the past, Densho needed a complex storage environment to house its 30TB of production data, says Geoff Froh, deputy director and CIO at the nonprofit organization based in Seattle. “The two-tier infrastructure was composed of high-performance SAN hardware and high-capacity consumer-grade NAS appliances. The SAN was expensive, difficult to manage and not scalable. The NAS gear was unreliable and lacked the IOPS to handle our workload,” Froh recalls. Densho turned to storage start-up Qumulo, which aims to help enterprises store their data more efficiently and with greater visibility into how content is being used.

Good managers give constructive criticism—but truly masterful leaders offer constructive praise

Most leaders “vastly underestimate the power and necessity of positive reinforcement,” Jack Zenger and Joe Folkman, CEO and president of Zenger/Folkman, write in Harvard Business Review. An abundance of research shows that giving positive feedback increases employees’ sense that they’re learning and growing at their jobs, makes them feel valued, and leads to increased confidence and competence. A 2015 Gallup survey found that 67% of employees whose managers communicated their strengths were fully engaged in their work, as compared to 31% of employees whose managers only communicated their weaknesses. One study found that high-performing teams receive nearly six times more positive feedback than less effective teams—evidence that positive reinforcement really does help the bottom line.

A massive cyberattack is hitting organisations around the world

Many of the initial reports of affected organisations came from Ukraine, including banks, energy companies and even Kiev's main airport. But since then more incidents have been reported across Europe, indicating the incident is affecting organisations far more widely. The National Bank of Ukraine said it has been hit by an "unknown virus" and is having difficulty providing customer services and banking operations as a result, while Kiev's Boryspil International airport is also understood to be suffering from some kind of cyber attack. Ukraine's Interior Ministry has already called the cyberattack the biggest in Ukraine's history. Danish transport and energy firm Maersk has confirmed that its IT systems are down across multiple sites due to a cyberattack, while Russian petroleum company Rosneft has reported a "massive hacker attack" hitting its servers.

How Proper Offboarding Can Help Prevent Data Breaches

"We need to move beyond having a key card or simply taking away people's keys," Hoyas added. "That's not effective nowadays because we have a very mobile workforce." Employees use mobile phones, work remotely on laptops, and log in to company systems from their own computers through shared drives or the cloud. "You need to manage your employees wherever they exist and wherever they log in from," he said. "Users log in from home, from their office and they can log into apps and e-mails from their own devices. Most of the time companies aren't paying for people's cellphones," he pointed out. Employers should keep that in mind when an employee leaves and they must cut off access to his or her computer, Hoyas said.

Under pressure, Western tech firms bow to Russian demands to share cyber secrets

The demands are being made by Russia’s Federal Security Service (FSB), which the U.S. government says took part in the cyber attacks on Hillary Clinton’s 2016 presidential campaign and the 2014 hack of 500 million Yahoo email accounts. The FSB, which has denied involvement in both the election and Yahoo hacks, doubles as a regulator charged with approving the sale of sophisticated technology products in Russia. The reviews are also conducted by the Federal Service for Technical and Export Control (FSTEC), a Russian defense agency tasked with countering cyber espionage and protecting state secrets. Records published by FSTEC and reviewed by Reuters show that from 1996 to 2013, it conducted source code reviews as part of approvals for 13 technology products from Western companies. In the past three years alone it carried out 28 reviews.

UX is Grounded in Rationale, not Design

Sketching things out is great because it allows you to visualize and conceptualize, but don't sketch solutions without understanding the problem; you will end up boxing in your thought process too early if you do. Though some schools of thought say that sketching at the beginning is good, that time could instead be used to distill information and create a solid framework for the work you are trying to do. ... Without building a rationale behind the problem, the reasoning behind my design decisions would have belonged to a framework that didn't exist to support them. The things I built wouldn't have been as effective had I not first focused on making sense of my research.

Building a Blockchain PoC in Ten Minutes Using Hyperledger Composer

Hyperledger Composer, one of the Hyperledger projects hosted by The Linux Foundation, aims to solve this problem by making it easy for blockchain developers to model business assets, participants and transactions and to turn these models into viable blockchain applications. Hyperledger was set up in December 2015 as a collaborative effort to advance cross-industry open-source blockchain technologies for business. It is the fastest growing project in Linux Foundation history and the Hyperledger umbrella currently includes several technologies, from blockchain frameworks such as Hyperledger Fabric and Hyperledger Sawtooth to tools that provide services such as monitoring, identity, development and deployment. Hyperledger Composer is one of these tools.

26 Tools and Frameworks for HTML-based Desktop and Web App Interfaces

If Angular 2 development is your thing, check out Kendo UI for Angular 2, an all-new version of Kendo UI built with TypeScript, JavaScript, and NativeScript -- no jQuery dependencies! Kendo UI for Angular 2 is in beta as we go to press. Licensing details will be announced along with the V1 release in 2017. Kendo UI Professional is available with a free trial version and per-developer, royalty-free licenses at several tiers providing access to additional Telerik developer resources. jQuery UI is another option for building HTML and JavaScript-based application interfaces. It's completely open source and has the advantage of being directly compatible with jQuery, jQuery Mobile, the QUnit JavaScript unit testing framework, and the Sizzle pure-JavaScript CSS selector engine, all directed and licensed by the jQuery Foundation.

Windows Server Gets The Fast Train

Nano as a container image made for a good strategic fit, Gaynor opined, with the every-six-month upgrade pace justified by the tempo of containerization. "Just look at what's happened with containers in the last five years," he said. Meanwhile, making Server Core available as either always-changing or static also "made sense" to Gaynor because it had taken the place of Nano as the default smaller-footprint installation. The faster tempo lets aggressive customers "have their cake and eat it, too," said Gaynor. Cumulatively, those twice-annual upgrades will compose the feature set of the next Windows Server X. In two or three years, Microsoft will put a stake in the virtual ground by christening Windows Server 2018 or Windows Server 2019, built by the iterative process of shipping Server Core updates.

Quote for the day:

"A positive attitude will not solve all your problems. But it will annoy enough people to make it worth the effort." -- Herm Albright

Daily Tech Digest - June 26, 2017

12 'best practices' IT should avoid at all costs

Legitimizing the idea of internal customers puts IT in a subservient position, where everyone in IT has to make their colleagues happy, whether doing so makes sense for the business or not, let alone whether it encourages the company’s actual customers to buy more products and services. ... Want to do some damage? Establish formal service level agreements, insist your “internal customers” sign them, and treat these SLAs like contracts. And if you really want IT to fail, argue about whether you’ve satisfied your SLAs every time an “internal customer” (there’s that word again) suggests IT isn’t doing what they need it to do. It’s a great way to keep relationships at arm’s length.

Bill Gates and Digitization: Ahead of the Curve Yet Again

While we’ve had elements of a digital supply chain for quite some time, in this more holistic sense of a digital nervous system, we are only beginning to scratch the surface. A nervous system can take our sensory inputs – sight, sound, touch, taste, and smell – and a person can react either instantly or more thoughtfully to what is happening around them. While a WMS is a digital supply chain application, it has a limited scope in how it is using sensor data. It certainly does not react in the holistic way that a nervous system does. There has been an explosion of new sensor data available to be used to create digital supply chains. We are using, or learning to use, SNEW data: social media, news, event, and weather data.

Key Abstractions for IoT-Oriented Software Engineering

The term "IoT system" generally refers to a set of IoT devices and the middleware infrastructure that manages their networking and interaction. Specific software can be deployed logically above an IoT system to orchestrate system activities to provide both specific services and general-purpose applications (or suites of applications). Providing specific services means enabling stakeholders and users to access and exploit things and direct their sensing or actuating capabilities. This includes coordinated services that access groups of things and coordinate their capabilities. For instance, in a hotel conference room, besides providing access to and control of individual appliances, a coordinated service could, by accessing and directing the lighting system, the light sensors, and the curtains, change the room from a presentation configuration to a discussion configuration.
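The conference-room example above can be sketched in a few lines. This is a hypothetical, in-process illustration (the class and device names are invented, not from the paper): a coordinated service sits logically above individual things and directs several of them with one high-level call.

```python
# Hypothetical sketch: a coordinated service orchestrating individual
# "things" (lights, curtains) in a conference room.
class Lights:
    def __init__(self):
        self.level = 100  # percent brightness

    def dim(self, level):
        self.level = level

class Curtains:
    def __init__(self):
        self.is_open = True

    def close(self):
        self.is_open = False

    def open(self):
        self.is_open = True

class ConferenceRoom:
    """Coordinated service layered above the individual devices."""
    def __init__(self):
        self.lights = Lights()
        self.curtains = Curtains()
        self.mode = "discussion"

    def set_mode(self, mode):
        # One high-level call directs several things at once.
        if mode == "presentation":
            self.lights.dim(20)
            self.curtains.close()
        elif mode == "discussion":
            self.lights.dim(80)
            self.curtains.open()
        self.mode = mode

room = ConferenceRoom()
room.set_mode("presentation")
print(room.mode, room.lights.level, room.curtains.is_open)
```

The point of the abstraction is that callers address the coordinated service ("presentation mode"), not the individual sensing and actuating capabilities beneath it.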

Cybersecurity: The New Normal

Today, cybersecurity is high on everyone’s radar, as a powerful new reality that is penetrating all facets of cyberspace. On a near-daily basis we read of damage to hardware, software, content, products, and processes. No one is immune. No one is safe. This new reality, with the variety of threats, exploits and damages that seemingly multiply day by day, creates new markets, new business opportunities, new strategic concerns and threats to our collective views of law and order. These elements are shaping a new normal which is not yet fully understood. But they are clearly anchored in the nature of the hardware, in ever-changing uses and functions enabled by evolving software, and fueled by the power of human ingenuity. When the Internet was designed, threats to security were not central to the basic architecture nor to the core design principles.

Companies are wasting massive amounts of money on ineffective security solutions

The survey also found that massive amounts of time and money are wasted on ineffective endpoint security solutions and lack of endpoint visibility and control is a major issue. Ineffective overall endpoint security protection costs an average of $6 million in detection, response, and wasted time. Only 27% of survey respondents have confidence that their company can identify the endpoint devices which pose the greatest risk in a highly effective fashion. Worse, 20% reported having no endpoint security strategy at all. On average, according to the report, companies spend over 1150 hours on a weekly basis attempting to detect and contain insecure endpoints, which represents a cost of $6 million spent detecting and containing insecure endpoints or suffering unplanned downtime. Nearly half of those hours are spent chasing false positives, which equates to $1.37 million of annual wasted expenditures.
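A back-of-envelope check makes the report's figures concrete (assuming 52 working weeks per year, which the report does not state):

```python
# Rough arithmetic on the cited figures: 1,150 hours per week spent on
# detection/containment, with "nearly half" chasing false positives at a
# cited annual waste of $1.37M.
hours_per_week = 1150
weeks_per_year = 52                                  # assumption
annual_hours = hours_per_week * weeks_per_year       # 59,800 hours/year

false_positive_hours = annual_hours * 0.5            # "nearly half"
fp_annual_cost = 1_370_000                           # $1.37M cited
implied_hourly_rate = fp_annual_cost / false_positive_hours

print(annual_hours, round(implied_hourly_rate, 2))
```

At those numbers the implied fully-loaded cost is roughly $46 per hour of false-positive chasing, which is one way to sanity-check the report's totals.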

How to handle risks of hypervisor hacking

First, hold virtualization implementers to high standards. We have learned a lot in the last few decades about development methodologies that reduce defects and quickly detect and remediate defects that make it through development and into production. When consistently practiced, DevOps, the methodology that removes the traditional boundaries between development, deployment, and production, and embraces continual improvement, has greatly increased system reliability. Hypervisor implementations have fared well. Although potential exploits have been found, the hypervisor developers have also been diligent about fixing problems. This has kept the number of actual malicious exploits low. However, developers make mistakes and diligence is not absolute protection. Some flaws always creep in.

7 reasons to switch to microservices — and 5 reasons you might not succeed

With microservices, your code is broken into independent services that run as separate processes. Output from one service is used as an input to another in an orchestration of independent, communicating services. Microservices is especially useful for businesses that do not have a pre-set idea of the array of devices its applications will support. By being device- and platform-agnostic, microservices enables businesses to develop applications that provide consistent user experiences across a range of platforms, spanning the web, mobile, IoT, wearables and fitness tracker environments. Netflix, PayPal, Amazon, eBay, and Twitter are just a few enterprises currently using microservices.
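The "output of one service is the input to another" idea can be sketched minimally. The service names and data here are invented for illustration; in a real system each function would be a separate process reached over the network:

```python
# Minimal sketch of microservice orchestration: each function stands in
# for an independently deployed service.
def catalog_service(query):
    # Would normally be an HTTP call to a separate process.
    items = {"book": 12.0, "pen": 1.5}
    return {"item": query, "price": items[query]}

def pricing_service(order):
    # Adds tax; knows nothing about the catalog's internals.
    return {**order, "total": round(order["price"] * 1.08, 2)}

def checkout(query):
    # The output of one service becomes the input of the next.
    return pricing_service(catalog_service(query))

result = checkout("book")
print(result["total"])
```

The decoupling is the point: either service can be rewritten, redeployed, or ported to a new platform as long as the data it emits keeps its shape.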

MicroStrategy CEO Michael Saylor speaks about ‘expanding universe’ of BI

Every company has to decide where to make its investments. Some BI company might come along and say “we are the best for the Hortonworks distribution of Hadoop”, and that might fly for a while. But I have to say I have been in this business for 27 years and every three years there is a new data technology which is the rage. I remember one that was billed as the world’s fastest database, and I asked one of their sales people what was in the next release, and he said “joins”. That’s a colossal joke because there is no serious problem that you can solve without doing table joins. So, yes, as long as you don’t need to ask the next question or need mathematics or need more than two users to run a query, it’s super-fast and great.
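The "joins" anecdote is easy to make concrete: almost any question that spans two tables needs one. A small illustration using Python's built-in SQLite (tables and values invented for the example):

```python
import sqlite3

# Answering "revenue per region" requires combining two tables,
# which an engine without joins cannot do.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE customers (id INTEGER, region TEXT);
    CREATE TABLE orders    (customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'EMEA'), (2, 'APAC');
    INSERT INTO orders    VALUES (1, 100.0), (1, 50.0), (2, 75.0);
""")
rows = con.execute("""
    SELECT c.region, SUM(o.amount)
    FROM customers c JOIN orders o ON o.customer_id = c.id
    GROUP BY c.region ORDER BY c.region
""").fetchall()
print(rows)
```

Without the join, the "next question" (who bought, where, how much in total) simply cannot be asked, which is Saylor's point about single-trick databases.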

Self Contained Systems (SCS): Microservices Done Right

Finding Bounded Contexts can be done by grouping user stories together. So, for example, searching for products by full-text search, by categories or by recommendations might be part of the same Bounded Context. Of course the split is not clear-cut: depending on the complexity, the search might be split into multiple Bounded Contexts. A user journey might also provide ideas about a split into SCSs. The customer journey describes the steps a customer takes while interacting with the system, e.g. searching for products, check-out or registration. Each of these steps could be a candidate for an SCS. Usually these steps have few dependencies. Often there is a hand-over between these steps: the shopping cart is handed over to the checkout, where it becomes an order, and is then handed over to fulfillment.
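The hand-over between steps can be sketched as data passing between two systems. The names and fields here are illustrative, not from the article; the important property is that each SCS owns its own concept (cart vs. order) and only a plain payload crosses the boundary:

```python
# Sketch of the hand-over between two Self-Contained Systems: the cart
# SCS passes a plain data structure to the checkout SCS, which turns it
# into an "order" that it fully owns from then on.
def cart_scs():
    # The cart SCS's only obligation at the boundary is this payload.
    return {"items": [("book", 1), ("pen", 3)], "customer": "c-42"}

def checkout_scs(cart):
    # The checkout SCS derives its own domain object from the payload.
    return {
        "order_id": "o-1001",
        "lines": cart["items"],
        "customer": cart["customer"],
        "status": "placed",
    }

order = checkout_scs(cart_scs())
print(order["status"], len(order["lines"]))
```

Because the only coupling is the hand-over payload, each step of the journey can evolve, deploy, and even render its own UI independently.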

Using supercomputing to attract research and investment

This enables the team, led by Dr. David Matthews, Senior Lecturer in Virology at the University, to examine how the virus had evolved over the previous year, informing public health policy in key areas such as diagnostic testing, vaccine deployment and experimental treatment options. This complex data analysis took around 560 days of supercomputer processing time, generating nine thousand billion letters of genetic data before arriving at the virus's 18,000-letter genetic sequence for all 179 blood samples. This is just one of many examples of how HPC at the University is contributing to significant research projects. The University is now in its 10th year of using HPC, and each phase, from the first supercomputer through to BC4, has been bigger and better than the last, a trend set to continue in the years to come.

Quote for the day:

"Once you've accepted your flaws no one can use them against you." -- George R.R. Martin

Daily Tech Digest - June 25, 2017

7 Disruptive Technologies Destined To Change The World

Before 2020, fully autonomous vehicles will become a fixture on our highways and not long after, autonomous taxi networks will experience unprecedented growth that will radically transform the nature of travel and transportation, with a corresponding boost in productivity. Autonomous travel, costing only half as much as driving a personal car, will drive car sales down. The decline in battery costs will make electric vehicles (EVs) more preferable to gas-powered vehicles because it will be far less costly to own an EV. This will lead to widespread adoption of EVs and companies like Tesla will stand to gain the most ... Although it is the auto industry that might have driven the sale of industrial robots, it’s now far from being the only industry that employs the use of this technological innovation. Especially as capital and programming costs continue to decline, manufacturing companies will benefit more from employing robots and automating more of their processes.

Why blockchains fail and decentralization succeeds

With all of the excitement around blockchain technology, it’s easy to think what we have now is the foundation for the next wave. Yet, it’s worth remembering we are still in the early stages. The blockchains we have today probably won’t be the blockchains of tomorrow. ... It also has a lot of technical questions that surround it. As Muneeb Ali of Blockstack said, “At scale, Ethereum is designed to fail” — though he was quick to add that there’s always room to make changes in the future. He didn’t mean, “it will intentionally fail.” However, if you think about the nature of blockchains — everyone has a copy of the ledger, which these days is about a 100GB download. Furthermore, in the case of Ethereum, ever more third-party applications and sub-economies are being launched to run on top of it, and all of that code runs on the distributed network too. So it makes sense to start asking questions.
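The "everyone has a copy of the ledger" concern follows directly from how a blockchain is built: blocks are append-only and each commits to its predecessor, so every full node's copy only ever grows. A minimal sketch (not any production chain's format):

```python
import hashlib
import json

# Minimal hash-chained ledger: append-only, each block commits to the
# previous one, so the full copy every node holds can only grow.
def make_block(prev_hash, payload):
    body = json.dumps({"prev": prev_hash, "payload": payload}, sort_keys=True)
    return {"prev": prev_hash, "payload": payload,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

chain = [make_block("0" * 64, "genesis")]
for tx in ["alice->bob:5", "bob->carol:2"]:
    chain.append(make_block(chain[-1]["hash"], tx))

# Every later link depends on every earlier hash, which is why history
# cannot be pruned away casually.
valid = all(chain[i]["prev"] == chain[i - 1]["hash"]
            for i in range(1, len(chain)))
print(len(chain), valid)
```

Add third-party applications whose code and state also live on the network, as with Ethereum, and the scaling question in the excerpt becomes clear.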

Microsoft: No, It's Not An Audit

Because Microsoft and its partners offer fee-based SAM services, concerns on the part of customers about their practices could easily dampen enterprise enthusiasm for the evaluations, and thus reduce revenue from SAM programs. And Microsoft clearly sees SAM as a money maker for its partners. "The SAM opportunity in enterprise has never been bigger. Learn about Microsoft's plan for enterprise and industry accounts, and how you can build new revenue streams with SAM," states a description of one of several SAM-related sessions listed on the schedule for the upcoming Inspire conference in Washington, D.C. July 9-13. Microsoft Inspire is the renamed Worldwide Partner Conference, long the yearly massive meet-up of the firm's global partner network, on which Microsoft relies for much of its software and services sales.

Finding data relationships with intelligent graph analytics

In an RDF data store, we can pre-define the schema models, called ontologies, as well as load new datasets as they come in. So, instead of spending an enormous amount of time creating the data model, we started with a standard, the Financial Industry Business Ontology (FIBO) model, and decided to extend it as we encountered new sets of data. The expense involved with mastering data via custom code was avoided through the use of RDF Graph DB features. We could load multiple datasets into the RDF Graph DB as they are maintained in the source system, without creating special extract files. The connections happen in the database at the attribute level, between multiple domains as well as with transaction data. The major mindset change required is to not process master and transaction data separately and then build a dimensional model, but to build an integrated RDF Graph DB where they can co-exist, fully connected through a single set of processes.
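The core RDF idea, that all data, master or transactional, lives as subject-predicate-object triples in one graph, can be shown with a toy in-memory store. The identifiers and predicates below are invented for illustration and are not actual FIBO terms:

```python
# Toy triple store: master data (accounts, parties) and transaction data
# co-exist as (subject, predicate, object) triples and connect at the
# attribute level, without separate extract files or dimensional models.
triples = {
    ("acct:123", "rdf:type", "ex:Account"),
    ("acct:123", "ex:heldBy", "party:acme"),
    ("party:acme", "rdf:type", "ex:Corporation"),
    ("txn:900", "ex:debitsAccount", "acct:123"),  # transaction data
}

def query(s=None, p=None, o=None):
    # Match triples against an (s, p, o) pattern; None is a wildcard.
    return [(ts, tp, to) for ts, tp, to in triples
            if (s is None or ts == s)
            and (p is None or tp == p)
            and (o is None or to == o)]

# Which transactions touch accounts held by acme?
accounts = [s for s, _, _ in query(p="ex:heldBy", o="party:acme")]
txns = [s for a in accounts
        for s, _, _ in query(p="ex:debitsAccount", o=a)]
print(txns)
```

A real RDF store adds ontologies, inference, and SPARQL on top, but the schema-late, everything-in-one-graph property the excerpt describes is already visible here.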

Cybercrime industry growing rapidly, cybersecurity can't keep up

"It's a constant game of cat and mouse between the defenders and the attackers," Maor noted. With technology constantly changing, security has a tough time keeping up. Maor explained that the security industry moves significantly slower than the cybercrime industry because cybercrime faces no regulations. Maor said it's imperative for people to change how they approach security. Companies are not doing the basic things to protect themselves from cybercrime; they need to have backups in place and always be prepared, he added. The mindset around cybersecurity and cybercrime must shift. Businesses need to operate under a "when will I get hacked" rather than an "if I get hacked" mentality, making security more of a priority than expediency in releasing a product.

The next industrial revolution is upon us … and many don’t even realize it.

As we enter the Fourth Industrial Revolution, rapid and unpredictable shifts in technology will present both challenges and opportunities. The sheer volume of available data in the new world could fundamentally change the way society operates by developing previously unthinkable solutions to problems we didn’t know existed. Digitization of everyday things, when coupled with the ability to self-enhance through artificial intelligence, will drive significant change in the global economy.  Failure to prepare for and respond to digitization in the Fourth Revolution will be costly, especially as new market entrants test and evolve. The dramatic rise and fall of video rental giant, Blockbuster, is a poignant illustration of how digital innovator, Netflix, overtook the $5 billion incumbent by gradually siphoning off its customer base.

The Revolution Will Begin Eventually (Maybe): AI and Recruiting

Evaluating motivation is about improving sourcing, which is typically a low-yield, labor-intensive business. Every recruiter knows that reaching out to candidates who have not applied often produces few results because of low response rates. However, a machine learning system can identify people who are more likely to consider a solicitation for a job; in other words, those who are more motivated to change jobs or accept a new one. There’s an abundance of data on social networks and other places that can be tapped for this purpose. For example, Google’s Timeline tracks your every move (check it out) and can be used to accurately determine a person’s commute. A candidate with a long commute is more likely to respond to a solicitation than someone who has a short one, especially if the former travels through heavy traffic.
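A drastically simplified sketch of the scoring idea: the features (commute, traffic, tenure) come from the article's discussion, but the weights and function below are invented for illustration; a real system would learn them from response data.

```python
# Hypothetical motivation score from signals like commute length and
# traffic; weights are made up, not learned.
def response_score(commute_minutes, heavy_traffic, years_in_role):
    score = 0.0
    score += min(commute_minutes, 90) / 90 * 0.5   # long commute -> more motivated
    score += 0.3 if heavy_traffic else 0.0
    score += min(years_in_role, 5) / 5 * 0.2       # long tenure -> more open to a move
    return round(score, 2)

long_commute = response_score(75, heavy_traffic=True, years_in_role=4)
short_commute = response_score(10, heavy_traffic=False, years_in_role=4)
print(long_commute, short_commute)
```

Even this toy ranks the long-commute, heavy-traffic candidate well above the short-commute one, which is the prioritization a sourcing team would act on.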

India Sees a Significant Rise in Data Breach Cost

The increased speed of these cyber incidents allows for more such attacks to occur, and Shahani suggests that has had an adverse impact on organizations' bottom line. "The penalty is huge as the cost of data breach incidents for companies in India and Asia [and] is significantly increasing this year from what was observed during the previous year," Shahani says. According to the study, the cost of a data breach in India this past year increased by 12.3 percent. The cost of lost or stolen records in the past year rose by 12.8 percent. The study cites malicious or criminal attacks, insider negligence and system glitches as the root causes of data breaches, and that, Shahani says, makes a huge impact on the cost, beyond the time to detect and contain the incident.

Multigenerational workforces: 6 ways to foster digital change

Digital transformation is not all about tools or technology—it’s about people too. Today's workplaces are becoming increasingly multigenerational. Older employees are staying in the workforce longer and mixing with younger colleagues who are just starting their careers. As such, the range of ages in the workplace is naturally expanding. A recent survey from executive development firm Future Workplace and Beyond, The Career Network, found that 83 percent of respondents have seen millennials managing Gen X and baby boomer workers in their office. However, 45 percent of baby boomers and Gen X respondents said millennials lack managerial experience, which could have a negative impact on a company's culture. More than a third of millennial respondents said managing older generations is challenging.

The inextricable link between IoT and machine learning

In optimizing the computational cost of a machine learning model, as in all other use cases, there is a trade-off between accuracy and image resolution. The lower the resolution at which accuracy still holds up, the shorter the flight time a drone needs to criss-cross a field and the longer its battery lasts. In addition to saving the time and cost of deploying IoT devices and the networks to interconnect them, machine learning could serve as a separate path to confirm that an IoT system is working. A critical IoT device could fail and report a false condition. For instance, IoT sensors might fail to report critical conditions such as a fire, an unauthorized person entering or a door left open, but a machine learning model sampling a video feed could recognize the critical condition, all as adaptations of ResNet-50 or another convolutional network.
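The resolution trade-off has simple arithmetic behind it: for a convolutional network, per-frame compute grows roughly with the number of input pixels, so halving resolution quarters the work. A back-of-envelope sketch (illustrative, not measured ResNet-50 numbers):

```python
# Rough cost model: per-frame compute scales with pixel count, so lower
# resolution means more frames per battery charge for the drone.
def relative_cost(width, height, base=(224, 224)):
    return (width * height) / (base[0] * base[1])

full = relative_cost(224, 224)   # 1.0x baseline
half = relative_cost(112, 112)   # 0.25x, i.e. ~4x the frames per charge
print(full, half)
```

The open question the excerpt raises is empirical: how far resolution can drop before accuracy on conditions like fire or intrusion degrades.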

Quote for the day:

"The obvious is that which is never seen until someone expresses it simply." -- Khalil Gibran

Daily Tech Digest - June 24, 2017

Compliance Does Not Always Cure Health Care Security Woes

According to a Level 3 report, “cyberthreats and the security landscape evolve rapidly, and industry standards cannot keep pace.” Compliance standards can only reflect best practices as of the time when the draft standards were approved. But because of the rapid evolution of the technology environment, best practices are a fast moving target. Today’s networks are liable to have far more endpoints than what was typical even a few years ago. Indeed, the contemporary focus of security thinking is shifting from primarily endpoint protection to an emphasis on trust of specific users and devices. The current compliance framework only imperfectly reflects this very recent development. In health care, we are now moving from mere mobile connectivity to the Internet of Things (IoT) and connected devices.

Will Technology Destroy Your Company or Make You a Fortune?

While corporate boards everywhere may be beginning to feel the cold wind blowing from the startup sector, it is possible for established companies to stay relevant—and even enjoy success—in the arenas of FinTech and the Internet of Things. It begins, of course, with the correct mindset: leadership must be willing to be bold and drive innovation with strategy. For example, Collins and Porras highlighted the importance of the Big Hairy Audacious Goal (BHAG), such as that determined by JFK in his desire to put a man on the moon in less than a decade. A BHAG is vital in the face of such disruptive technological change. The key to corporate survival in the FinTech and IoT arenas lies in the open-minded approach of embracing innovation and preparing for changing technologies. However, the fact remains that large, mature companies face unique challenges when it comes to innovation.

How HR can lead digital transformation

There is no denying that digitalisation means change. Many people fear change, an attitude which could halt, or even thwart, progress. The root of this fear is the perceived loss of control – by moving to something new, people will feel less in control. Processes which took no time at all, or they were comfortable with, will now take a bit longer, with new ways of working to be learnt. This can be disheartening, particularly during any transition period. The best advice is to take small steps. Take on one project (for instance, look at your own HR services – could you digitalise part of your HCM by moving it into the cloud, removing a legacy system and potentially reducing cost?). Identify your roadmap, what success looks like and then take the plunge from the low diving board. The high diving board will still be there when you, and your organisation, are confident enough to climb.

Microsoft says 'no known ransomware' runs on Windows 10 S -- so we tried to hack it

Although Hickey used publicly known techniques that are widely understood by security experts, we nevertheless privately informed Microsoft's security team of the attack process prior to publication. For its part, Microsoft rejected the claims. "In early June, we stated that Windows 10 S was not vulnerable to any known ransomware, and based on the information we received from ZDNet that statement holds true," said a spokesperson. "We recognize that new attacks and malware emerge continually, which is why [we] are committed to monitoring the threat landscape and working with responsible researchers to ensure that Windows 10 continues to provide the most secure experience possible for our customers." This hack may not have been the prettiest or easiest to launch. You could argue that the hack took too many steps that wouldn't be replicated in the real world.

How Artificial Intelligence will impact professional writing

Now, Artificial Intelligence is making inroads in the field by providing smart summaries of documents. An AI algorithm developed by researchers at Salesforce generates snippets of text that describe the essence of long text. Though tools for summarizing texts have existed for a while, Salesforce’s solution surpasses others by using machine learning. The system uses a combination of supervised and reinforced learning to get help from human trainers and learn to summarize on its own. Other algorithms such as Algorithmia’s Summarizer provide developers with libraries that easily integrate text summary capabilities into their software. These tools can help writers skim through a lot of articles and find relevant topics to write about. They can also help editors read through the tons of emails, pitches and press releases they receive every day.
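For contrast with the learned, abstractive approach described above, here is a minimal frequency-based *extractive* summarizer, the pre-machine-learning baseline such systems surpass. It simply ranks sentences by how common their words are in the document:

```python
import re
from collections import Counter

# Minimal extractive summarizer: score each sentence by the average
# document-wide frequency of its words, return the top sentence(s).
def summarize(text, n_sentences=1):
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"[a-z']+", text.lower()))

    def score(sentence):
        tokens = re.findall(r"[a-z']+", sentence.lower())
        return sum(freq[t] for t in tokens) / max(len(tokens), 1)

    ranked = sorted(sentences, key=score, reverse=True)
    return " ".join(ranked[:n_sentences])

text = ("The model summarizes long documents. Summaries help editors. "
        "The model was trained on many documents.")
print(summarize(text))
```

An abstractive system instead generates new phrasing rather than copying sentences, which is the harder problem the Salesforce work tackles with supervised and reinforcement learning.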

Machine Learning Enhances Processes Greatly

Artificial intelligence (AI) has arrived in a big way, with corporations leveraging it to unleash new value-addition avenues for their businesses. This is seen in the statistics too: according to a PwC survey, 30% of the industry believes that AI will be a key disruptor to business within the next five years. Machine learning, as a branch of AI, is backed by hard data and logic, enabling leaders to make better decisions. It finds numerous applications, from process optimization to growing the top line, and is set to be all the rage in the industry. Companies today are still experimenting with machine learning technologies at various stages, with some driving pilot projects and others integrating them into their mainstream processes. At any stage, machine learning should be seen as translating into value-add for the customer.

Google’s New AI Is Better at Creating AI Than the Company’s Engineers

AutoML has the potential to impact much of the other AI and machine learning-driven software that was discussed at the conference. It could lead to improvements in the speech recognition tech required for a voice-controlled Google Home, the facial recognition software powering the Suggested Sharing feature in Google Photos, and the image recognition technology utilized by Google Lens, which allows the user to point their phone at an object in order to identify it. Truly, AI has the potential to affect far more than just our homes and phones. It’s already leading to dramatic advancements in healthcare, finance, agriculture, and so many other fields. If we can use an already remarkable technology to improve that same kind of technology, every advancement made by humans can lead to machine-powered advancements, which lead to better tools for humans, and so on.

Know the Flow! Microservices and Event Choreographies

Central to the idea of event collaboration is that all microservices publish events when something business-relevant happens inside of them. Other services may subscribe to that event and do something with it, e.g. store the associated information in a form optimal for their own purposes. At some later point in time, a subscribing microservice can use that information to carry out its own service without being dependent on calling other services. Therefore, with event collaboration a high degree of temporal decoupling between services becomes the default. Furthermore, it becomes easy and natural to achieve the kind of decentralized data management we look for in a microservices architecture. The concept is well understood in Domain-Driven Design, a discipline currently accelerating in the slipstream of microservices and the "new normal" of interacting, distributed systems in general.
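The publish/subscribe pattern described above can be sketched with an in-process event bus (a stand-in for a real broker such as Kafka or RabbitMQ; the service and event names are hypothetical):

```python
from collections import defaultdict

class EventBus:
    """Minimal in-process stand-in for a message broker."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, event_type, handler):
        self._subscribers[event_type].append(handler)

    def publish(self, event_type, payload):
        for handler in self._subscribers[event_type]:
            handler(payload)

class ShippingService:
    """Keeps its own copy of the data it needs, so it can act later
    without calling the order service (temporal decoupling)."""
    def __init__(self, bus):
        self.addresses = {}  # local, purpose-built read model
        bus.subscribe("OrderPlaced", self.on_order_placed)

    def on_order_placed(self, event):
        self.addresses[event["order_id"]] = event["address"]

bus = EventBus()
shipping = ShippingService(bus)
bus.publish("OrderPlaced", {"order_id": 42, "address": "1 Main St"})
```

Once the event is consumed, the shipping service can fulfil orders from its own store even if the order service is temporarily unavailable — the decoupling the excerpt describes.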

The Future of Work: Death of the Single Skill Set In The Age Of Automation

This death of the single skill set has been documented by David Deming, associate professor of education and economics at Harvard University. Dr. Deming argues that many jobs requiring only mathematical skills have been automated, but roles which combine mathematical and interpersonal skills (such as economists, health technicians, and management analysts) will be in demand. These findings are reinforced by a study conducted by Business Higher Education Forum and Gallup that examined the percent of employers who say both data science and analytical skills will be required of all managers by 2020. As noted in the chart below, this is predicted to be true for managers who span the functions of finance, marketing, operations, supply chain and Human Resources.

Divvying up must-do secure application development list for dev and ops

It's a very different security situation with microservices for a couple of reasons. One reason is that microservices tend to be used because organizations want to iterate and deploy applications very quickly. Very rapid deployments can compound the security challenge because you can't afford to run lengthy security procedures against every single deployment. So that's one challenge to consider. Another is that if an individual microservice were exploited in some way, it could provide a gateway into the entire application. So it's very important to track and manage all of the traffic coming in, to control which microservices can be directly addressed by external parties, and to focus your security efforts on those services to ensure that they are difficult to compromise and have limited privileges should they be compromised.
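The idea of controlling which microservices can be directly addressed by external parties is often implemented as an allowlist at an API gateway. A minimal sketch (service names are hypothetical):

```python
# Only explicitly published services may be reached from outside;
# everything else is internal-only.
PUBLIC_SERVICES = {"catalog", "checkout"}

def route(service, request, internal=False):
    """Gateway rule: internal callers may reach any service; external
    callers are confined to the published allowlist."""
    if not internal and service not in PUBLIC_SERVICES:
        return {"status": 403, "body": "service not externally addressable"}
    return {"status": 200, "body": f"forwarded to {service}: {request}"}
```

An exploited public service still cannot widen the allowlist, which limits the "gateway into the entire application" risk the excerpt describes.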

Quote for the day:

"It does not suffice to hone your own intellect (that will join you in your grave), you must teach others how to hone theirs." -- EW Dijkstra

Daily Tech Digest - June 23, 2017

Where to spend your next security dollar

You probably haven’t thought about NACD for cybersecurity training. But the program is the best security management course I have seen; it is online and will give your senior executives a great overview of what your organization needs to be doing about security and risk management. The course describes the security management function and is general in scope, not compliance focused. If your executives participate in this training, they (and you) will have an excellent idea of the essential practices your organization needs to follow. The program connects security practices with business issues and language. I don’t have anything against my ISC2 and ISACA training courses, but their roots are in technology and audit. This training’s roots are in business.

Lightworks 14 review: Free video editing software lacks proper Mac decorum

With version 14, developer EditShare has taken great strides to make Lightworks more consumer-friendly, consolidating the previously modular user interface into a fixed, full-screen workspace. (The flexible “classic” mode is still available from the Project Layout settings.) With the organized, single-window UI comes an easier-to-use application, but Mac users won’t feel quite at home. For starters, there are no menu options at all, and Lightworks shuns Apple’s traditional contextual menu shortcuts in favor of the Windows right-click approach. Likewise, the file browser has a distinctly Unix look and feel that makes macOS seem like a second-class citizen. Coming from years of experience with native Mac editing software, the transition was a bit jarring to say the least.

Atomistic and Holistic View of an Enterprise

Enterprises are complex adaptive systems: systems characterized by complex behaviors that emerge from non-linear interactions in space and time among a large number of component systems at different levels of organization. That is, a view of the enterprise arrived at by breaking it down into smaller units of organization may be useful in comprehending each individual part and how it fits into the larger whole, but it will not lead to a holistic understanding of the enterprise itself. To use an analogy, if you take two cars, one from the UK and one from the US, and break them apart to understand them in terms of their components, that analysis might answer some questions about the functioning of these cars, but it alone will not tell you why one has the steering wheel on the left side and the other has it on the right side.

Automation And Society: Will Democracy Survive The Internet Of Things?

When the internet first went live, many commentators assumed it would provide a pure form of democracy. Everyone was given the same platform; age, race and gender were no longer relevant and we were all anonymous. But the reality was more chaotic, as we struggled to comprehend the power of a new tool that would revolutionize human life. This strange world was somewhere we could get lost and detach ourselves from everyday existence, but often it was also a quasi-reality that proved overwhelming and dangerous. Yet slowly we have found structure. Society has become inherently more intelligent - we can find the answer to almost anything at the click of a button. Those at the cutting-edge can now gain previously unimaginable insight into human tendencies and interests.

Stay out of the hot seat with turnkey private cloud

When it comes to implementing a multi-cloud strategy, your approach to private cloud can have a dramatic effect on your organization’s results. When compared with a DIY private cloud approach, implementing a turnkey private cloud will dramatically reduce the friction you experience. Less static friction with a turnkey private cloud means that your strategy will be implemented faster, accelerating time-to-value. Less dynamic friction with a turnkey private cloud means that ongoing cost and risk will be reduced, resulting in improved service levels and a better bottom line. Less friction means less heat. Keep yourself out of the hot seat and adopt a turnkey private cloud.

A new release management strategy depends on speed and efficiency

Dark launching is a similar process. Software is gradually and stealthily released to users in order to get their feedback as well as to test performance. Code is wrapped in a feature toggle that controls who gets to see the new feature and when. Facebook and Google rely on dark launches to gradually release and test new features to a small set of users before fully releasing them. This approach lets operations staff determine if users like or dislike the new function. It also allows for an assessment of system performance before moving ahead with a full release. As these different delivery options emerge, companies are looking for ways to train and familiarize their staff as part of a new software deployment strategy.
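The feature-toggle mechanism behind dark launching is commonly implemented by hashing the user and feature name into a stable rollout bucket. A minimal sketch (feature name and percentages are illustrative, not from the article):

```python
import hashlib

def in_rollout(user_id, feature, percent):
    """Deterministically bucket a user into a feature's rollout cohort.
    Hashing (feature, user) keeps each user's assignment stable across
    requests without storing any per-user state."""
    digest = hashlib.sha256(f"{feature}:{user_id}".encode()).hexdigest()
    return int(digest, 16) % 100 < percent

def render_search(user_id):
    # Dark-launch the new feature to 5% of users.
    if in_rollout(user_id, "new-search", percent=5):
        return "new search UI"
    return "current search UI"
```

Because the bucket is derived from a hash rather than stored, raising `percent` gradually widens the cohort while keeping existing members in it — which is what lets operations staff assess performance and user reaction before a full release.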

What it takes to be a security incident responder

The skills needed for a quality incident responder can be categorized into two main groups: personal skills and technical skills. “The greater one’s technical skills, the better the incident responder,” Henley says. Among the desirable skills are a good grasp of basic security principles such as confidentiality, authentication, access control and privacy; security vulnerabilities; physical security issues; protocol design flaws; malicious code; implementation flaws; configuration weaknesses and user errors or indifference. Responders should also know about the Internet of Things (IoT), risk management, network protocols, network applications and services, malicious code, programming skills and intruder techniques. IT security professionals who become leaders or members of response teams sometimes take circuitous routes to these positions.

5 ways businesses can cultivate a data-driven culture

The pressure on organizations to make accurate and timely business decisions has turned data into an important strategic asset for businesses. In today’s dynamic marketplace, the ability of businesses to use data to identify challenges, spot opportunities, and adapt to change with agility is critical to their survival and long-term success. Therefore, it has become an absolute necessity for businesses to establish an objective, data-driven culture that empowers employees with the capabilities and skills they need to analyze data and use the insights extracted from it to facilitate a faster, more accurate decision-making process. Contrary to what many people think, cultivating a data-driven culture is not a one-time transformation. Instead, it’s a journey that requires effort from employees and direction from both managers and executives.

The fight to defend the Internet of Things

One job of the IoT ecosystem, including technology, products, and service providers, is to protect millions (or even billions) of other people by introducing robust security capabilities into the wide variety of connected devices shipped every day. A robot or IP camera might require advanced computer vision and data processing power, while a connected light bulb may only need basic connectivity and a simple microcontroller. But they all need to be protected. Security needs to be considered in every aspect of the IoT, whether that’s the device itself, the network, the cloud, the software, or the consumer. Attacks are imminent. A study from AT&T, for instance, revealed a stunning 458 percent increase in vulnerability scans of IoT devices in the course of two years. Hackers usually exploit combinations of vulnerabilities to perform an attack.

It's Time To Upgrade To TLS 1.3 Already

The designers of TLS 1.3 chose to abandon the legacy encryption systems that were causing security problems, keeping only the most robust. That simplicity is perhaps one of the reasons it will be ready in half the time it took to design its predecessor. Connections will still fall back to TLS 1.2 if one end is not TLS 1.3-capable -- but if a MITM attacker attempts to force such a fallback, under TLS 1.3 it will be detected, Valsorda said. Almost 93 percent of the websites in Alexa's top one million supported TLS 1.2 as of January, up from 89 percent six months earlier, according to a survey by Hubert Kario's Security Pitfalls blog. But seven percent of one million means a lot of websites are still running earlier and even less secure protocols. Among the laggards are some sites you would hope to be on top of security: those taking online payments.
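Refusing the older, less secure protocols the laggard sites still run can be done directly in client configuration. For example, in Python's standard `ssl` module a client can set a floor of TLS 1.2, so it will negotiate 1.3 when available and refuse anything older (a minimal sketch; actual TLS 1.3 availability depends on the underlying OpenSSL build):

```python
import ssl

# Build a client context that accepts TLS 1.2 as the minimum and will
# negotiate TLS 1.3 when both ends support it; connections to servers
# offering only TLS 1.1 or older will simply fail the handshake.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2

# Whether this Python/OpenSSL build supports TLS 1.3 at all:
print(ssl.HAS_TLSv1_3)
```

On an established connection, `SSLSocket.version()` reports the protocol actually negotiated (e.g. `"TLSv1.3"`), which is one way to audit what a server really offers.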

Quote for the day:

"A good programmer is someone who always looks both ways before crossing a one-way street." -- Doug Linder

Daily Tech Digest - June 22, 2017

The future is not the cloud or the fog: it is actually the SEA!

A SEA device is a complete rethink of how your smartphone works. The phone in your pocket today is basically a fully integrated device consisting of many blocks of hardware and software all dedicated for you and your own purposes alone. With the exception of application software (e.g. gaming, music, personal assistants) that run in some part in the cloud, most everything you do on your device relies in some way or another on a local execution. A SEA device will not necessarily work like this. In the device virtualization paradigm, the same principles that allow virtualization across data centers or the abstractions of EPC elements in the cloud are applied to enable the dynamic decomposition of functions in a device into executable tasks.

10 tough security interview questions, and how to answer them

Suggest establishing an internal mentoring and training program, says Paul Boulanger, vice president and chief security consultant at consulting firm SoCal Privacy Consultants. That way the company can offer the staff personal growth through education and certifications, and a career path within the company itself so there’s an expectation from both sides to lay down roots and make a career, he says. “We want to avoid burnout with particular positions, so part of the training [would involve] job rotation,” Boulanger says. “Individuals will both be able to learn new technologies and stay fresh. We see this in the DevOps/agile movement now where developers are expected to be ‘full stack.’ We should encourage this on the security side too. It makes for better employees.”

Intelligence Panel Learns How to Hack Air-Gapped Voting Systems

How can air-gapped systems be hacked? Halderman explained that prior to an election, voting machines must be programmed with the design of the ballot, the races and candidates. Typically, he said, the programming - known as an election management system - is created on internet-connected desktop computers operated by local election officials or private contractors. Eventually, data from the election management system are transferred to voting machines. "Unfortunately," Halderman said, "election management systems are not adequately protected, and they are not always properly isolated from the internet. Attackers who compromise an election management system can spread vote-stealing malware to large numbers of machines." Another common perception is that because of the complexity and highly decentralized nature of the American election system, the results from a presidential election cannot be altered.

How containers will transform Windows 10

Helium, or application siloing, exists in Windows 10 today as part of the Creators Update, and especially Windows 10 S. This technology enables legacy Win32 applications to be ported to the Windows Store, using the Desktop Bridge (formerly code-named Project Centennial) to package apps. Application silos allow legacy Windows apps to install and update like native Modern Windows 10 apps. These converted desktop apps have full access to system resources, but use a virtual file system and virtualized registry entries like those associated with User Account Control (UAC) virtualization. A Helium-based container isn't a security boundary in the way that a Hyper-V virtual machine is. It lives on top of the existing registry and file system. You can think of it as the next generation of UAC but applied at an application level rather than a machine level.

Why and how to migrate cloud VMs back on premises

Before even thinking about a reverse migration, there are a number of nontechnical items that you need to consider. First, what does your contract say regarding early termination or leaving the cloud provider? This is more of a problem with smaller cloud service providers than with Amazon Web Services, Azure or Google, but it's worth checking, irrespective of provider. Also, check your licensing. An administrator can't just migrate a cloud VM back on premises and continue using the VM as if it still existed in the cloud. Prior to the migration, you need to check both the OS and application licensing small print. Be smart; get that confirmation in writing. Most likely, you'll find that different licensing rules apply. If you're trying to migrate a platform-as-a-service (PaaS) offering back to your data center, you need to have all the PaaS dependencies lined up.

How Will Analytics, AI, Big Data, and Machine Learning Replace Human Interactions?

“Amelia is working in areas such as wealth management, where she interacts with financial advisors: “They’re looking for the right answer the first time in as short a period of time as possible,” and Amelia was able to deliver just that. … The reaction from the executive team when he had demonstrated Amelia’s ability to answer questions “within seconds of that question being asked, the first time, correctly” had been overwhelmingly positive.” With the newer generation chat bots like Amelia, companies present an immersive and personalized interaction tool with customers, often on their web sites, able to access key data in knowledge management and then, using AI, to tee up the “best-fit” answers to questions (a) that are being asked by customers or (b) should have been asked by them.

AMD launches its Epyc server chip to take on Intel in the data center

Epyc will be socket-compatible with the next generation of the product family, and it also has a dedicated security subsystem, where AMD is burning cryptographic functions into the silicon of the memory controllers, effectively encrypting memory, Moorhead noted. This is AMD's third big try in the server market; it has had enough success and failure to say it knows what it takes to be successful. When it came out with the Opteron Dual Core processor in 2005, offering a twofold single-socket performance advantage over Xeon, it grabbed 20 percent of the market within two years. But a few years later, bugs and postponements in the launch of its Barcelona chip architecture allowed Intel to recapture lost ground.

Why Cisco’s new intent-based networking could be a big deal

A key component of an IBNS is that it provides mathematical validation that the expressed intent of the network can be and is implemented within the network, and that it has the ability to take real-time action if the desired state of the network is misaligned with the actual state. An IBNS is, in theory, a software platform that can be agnostic to the hardware that it runs on. The idea of IBNS has been around for a couple of years, Lerner says, but there have been very few platforms that can enable it. A handful of startups, such as Apstra, Veriflow and Forward Networks, have some early components of IBNS in various product offerings. Lerner estimates there are fewer than 15 intent-based networking platforms in production deployments today, but the number could grow to more than 1,000 by 2020.

How to stop wasting money on security shelfware

Shelfware is not inevitable, and it can be reduced or even eliminated by some proactive and surprisingly simple first steps. Infosec professionals believe it comes down to a more controlled acquisition process, sweating the products you already have -- and getting the basics right before acquiring new solutions. “First, leverage the products that have the broadest of capabilities, something that can give breadth of coverage,” says Malik. “This will help get a lay of the land and understand the challenging areas which can then be focussed on more specifically. Don’t try to boil the ocean, but start from critical assets. Finally, the best way is to experiment with the product and network with peers to see how they have deployed capabilities. Security doesn’t need to be a complex offering -- often it boils down to doing the basics well and consistently.”

GitLab's CEO Sid Sijbrandij on Current Development Practices

The open source model fell short in being able to build a business around it. You need significant work on installation, performance, security and dependency upgrades. If everything is open source you can only make money on support. Taking what we learned in 2013, we engineered GitLab to be user friendly to install and maintain, so after one year of subscribing, organizations quickly figured out that they did not use the support at all. Therefore, building a business model around support wouldn’t have been sustainable. Instead, we decided that there are some features and functions that are more useful to large development teams, like an enterprise organization. By offering extra functionality to customers with larger development teams or even more advanced needs, we continue to show our value through our product offering.

Quote for the day:

"Education ... is a process of living and not a preparation for future living." -- John Dewey

Daily Tech Digest - June 20, 2017

How to make sure your big data solution actually gets used

To gain visibility and automate steps at every stage of the food pick, pack, ship and deliver process, food producers, shippers, warehouses and retailers use handheld devices, barcode scanners, hands-free, voice-based technology and even sensors placed on pallets, packages and refrigeration compartments in trucks. These sensors track temperature, humidity and tampering of the containers for perishables and other goods, and also issue auto alerts to supply chain managers as soon as one of these conditions is violated. Everyone in the food supply chain knows where every shipment is. Along the way, big data is collected in a central data repository where queries and reports are subsequently run to assess how well the supply chain is performing.

Intel Core i9 review: The fastest consumer CPU prepares for Ryzen war

Like most major Intel launches, the Core i9 family represents a new platform, not just a new CPU, which means a new chipset, the X299, and a new socket, the LGA2066, all incompatible with previous CPUs.  The new platform also does something no previous one did by unifying two CPU families. Before today, if you wanted the company’s latest Kaby Lake core, you had to buy a motherboard using the LGA1151 socket. And if you wanted to buy, say, a 6-core Skylake CPU such as Intel’s Core i7-6800K, you had to buy an LGA2011 V3-based motherboard. With X299 and LGA2066, you can now pick your poison, because the platform encompasses everything from a 4-core Core i5 Kaby Lake CPU to an 18-core Core i9 Extreme Edition, which is a Skylake CPU.

Enlightened shadow IT policy collaborates with users

Now, the advantages of cloud services are changing shadow IT policy in many enterprises. The flat-out blocking of cloud services is unacceptable in most organizations today because team collaboration apps, for example, are useful to lines of business and work groups that use them to improve productivity. These apps are quick to deploy and eliminate the need for IT's permission or deployment. "Any IT leader who stands in the way of productivity probably isn't going to hold the job too long," Schilling said, adding that cloud services typically represent an opportunity, not a hindrance. Just like on-premises shadow IT efforts, however, Schilling knows that if something goes wrong and business users find themselves in trouble, IT will have to come to the rescue. "Rather than fighting it, we have to offer users governance and guidelines," he said.

The Rising Business Risks of Cyberattacks and How to Stay Safe

According to the 2017 Internet Security Threat Report by Symantec, cyber criminals have revealed new levels of ambition and malice. Data breaches are now driven by innovation, sophistication, and organization to produce ominous results. Cybersecurity has become more of a concern for businesses, which this year continue to face complex security threats. There is a growth of new malware that can bypass your antivirus and other levels of protection. Ransomware is on the rise: more than 4,000 ransomware attacks have occurred every day over the past year. Ransomware and phishing work together, with statistics from PhishMe showing a rising trend. When it comes to data breaches, the risk for organizations is high. The risks can range from the easily calculable costs of notification and business loss to the less tangible effect on a company’s brand and customer loyalty.

What you need to know about Power BI now

Starting in an Excel-like table view of your raw data, you use the query tools to construct a series of transformation steps, adding columns and changing data types using a formula-like approach. Once you’ve constructed a query, an advanced editor shows the resulting Power Query code, ready for additional editing or adding new steps. Power BI’s visual editing tools also help simplify your data, removing unwanted columns and changing names. Data from other sources can be merged into your query, adding additional information where necessary. Other tools pivot data into aggregate tables or add custom columns based on calculations. Sharing reports is as important as building them, and Power BI gives you several options. Perhaps the most useful is the ability to build and publish web dashboards that show key performance indicators and tie them to appropriate visualizations.

Excel 2016 cheat sheet

Excel has never been the most user-friendly of applications, and it has so many powerful features it can be tough to use. Excel 2016 has taken a good-sized step towards making it easier with a new feature called Tell Me, which puts even buried tools in easy reach. To use it, click the "Tell me what you want to do" text, to the right of the View tab on the Ribbon. (Keyboard fans can instead press Alt-Q.) Then type in a task you want to do, such as "Create a pivot table." You'll get a menu showing potential matches for the task. In this instance, the top result is a direct link to the form for creating a PivotTable -- select it and you'll start creating the PivotTable right away, without having to go to the Ribbon's Insert tab first. If you'd like more information about your task, the last two items that appear in the Tell Me menu let you select from related Help topics or search for your phrase using Smart Lookup.

Data should be stored in space, firm says

Data security will be another advantage when it comes to space-held data, the company says. It says “leaky internet and leased lines” are subject to “hijacking, theft, monitoring and sabotage” and that its dedicated telecom backbone network won’t be. In fact, its “network-ring” won’t be connected to the internet, it says. Better throughput, too, is obtained by “avoiding traditional terrestrial ‘hops,’” it claims. SpaceBelt’s still-to-be-launched data center platform will operate in low-earth orbit (LEO). That’s the area between the Earth’s surface and 1,200 miles up, and it is the same zone that the SpaceX and OneWeb internet infrastructures will use for their upcoming broadband constellation roll-outs. Cloud Constellation Corp. expects to build eight satellites for testing at the end of 2018, according to an interview chief executive Scott Sobhani gave to SpaceNews Magazine last year.

5 Steps to Prepare for the Inevitable Cyber Security Attack

To determine how much insurance coverage you need, use a calculator, assessment tool, or modeling to assess your overall risk. Paez recommended using an interruption worksheet, similar to what you may see for property insurance. Your insurance can provide templates, employee awareness training, regulatory preparedness, and PCI compliance readiness. Look at cyber attack risks from a business interruption perspective. “There may be organizations that are not in the, what I would term ‘high hazard’ class– business retail, hospitality, financial institutions, healthcare,” Paez explained. “If you’re outside of that realm looking at it from a business interruption standpoint or supply chain perspective, or utility or critical infrastructure, that’s a different conversation altogether in terms of assessing that risk. ...”

Cybersecurity spend: ROI Is the wrong metric

While this article has focused on helping board members and C-suite executives understand how to quantify the value of their cybersecurity investment, the InfoSec team may need to assist in the effort. If management is making the mistake of asking IT to justify its cybersecurity budget in terms of ROI, the InfoSec team needs to educate management as to why the ask is wrong and refocus them on the correct one. Furthermore, when making your argument against focusing on ROI, you need to provide the right data to support your point. Based on my experience, when asked to report on the security readiness of the network, most teams simply provide management with an exhaustive list of every potential threat that could harm the network; the strategy being that, when management sees a list of thousands of potential threats, they’ll agree to any budget out of fear and misunderstanding.
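One common alternative to the ROI framing (a standard risk-quantification model, not taken from the article) is annualized loss expectancy: price each threat by its expected yearly loss, then judge a control by how much expected loss it removes versus what it costs to run.

```python
def ale(single_loss_expectancy, annual_rate_of_occurrence):
    """Annualized Loss Expectancy: expected yearly loss from one threat."""
    return single_loss_expectancy * annual_rate_of_occurrence

def risk_reduction(ale_before, ale_after, annual_control_cost):
    """Net annual benefit of a control: expected loss it removes,
    minus what it costs to operate."""
    return (ale_before - ale_after) - annual_control_cost

# Illustrative numbers (assumptions, not from the article):
before = ale(200_000, 0.5)   # $200k per incident, once every 2 years
after  = ale(200_000, 0.1)   # control cuts occurrence to once a decade
print(risk_reduction(before, after, annual_control_cost=30_000))  # 50000.0
```

Presenting management with a small set of figures like this, instead of an exhaustive list of thousands of threats, keeps the budget conversation anchored in quantified risk rather than fear.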

Enterprise network monitoring tools address companies of all sizes

GroundWork offers a manual process for overlaying network infrastructure on geographic maps. Users can upload the image of a floor plan, topology, architectural software diagram or geographic map. Then, they can overlay GroundWork's performance and availability indicators on the image and drill down into those indicators for further analysis. The system's performance visualization features enable users to set dynamic thresholding and spot performance bottlenecks. The system gathers data from the network via SNMP, APIs, intelligent platform management interfaces, and a variety of other protocols and interfaces. Via these APIs, GroundWork has added the ability to monitor hybrid cloud environments. It integrates with cloud providers via a REST API, and it has out-of-the-box support for Amazon Web Services and OpenStack.
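The dynamic thresholding mentioned above is typically some variant of a mean-plus-k-sigma rule over recent samples (a generic sketch, not GroundWork's actual implementation):

```python
import statistics

def dynamic_threshold(samples, k=3):
    """Alert threshold derived from the metric's own recent history:
    the mean plus k standard deviations."""
    return statistics.fmean(samples) + k * statistics.pstdev(samples)

def is_anomalous(history, latest, k=3):
    """Flag the latest reading if it strays above the dynamic threshold."""
    return latest > dynamic_threshold(history, k)
```

Unlike a fixed threshold, this adapts per metric: a link that normally runs hot will not alert constantly, while a quiet one will flag even modest spikes.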

Quote for the day:

"It is not fair to ask of others what you are not willing to do yourself." -- Eleanor Roosevelt

Daily Tech Digest - June 19, 2017

A Data-Driven Approach to Identifying Future Leaders

Those with high motivation potential showed resilience and confidence in their capacity to lead; those who scored lower on this dimension were less likely to persevere when faced with new and unknown situations. Those who possessed strong people potential were empathetic and more adept at building relationships than their less people-savvy peers. And leaders with high change potential were able to move out of their comfort zones to experiment and take necessary risks; those who were more averse to change had more difficulty going against the status quo. ... Too many organizations eliminate talented leaders from consideration because the criteria used to determine potential are subjective and inconsistent. If created carefully, a clear, consistent definition of leadership potential can reduce the potential for bias, increase diversity, and save money by ensuring that the organization invests in high-potential employees early in their careers.

What Careers Are Safe From Automation And The Robot Takeover?

Jobs all across the business and finance landscape will be heavily affected in the manner of insurance underwriters: bookkeepers, accountants, auditors, loan officers, tellers, clerks, and postal service workers will easily be replaced by artificial intelligence. The legal profession is another highly populated sector that will have a difficult time as the need for secretaries, paralegals and court reporters declines. And if experts are correct in their projections, the very top business leaders might not be immune either. Jack Ma of Alibaba recently said that CEOs themselves could be on the chopping block, going so far as to predict that "In 30 years, a robot will likely be on the cover of Time Magazine as the best CEO." Ma paints a bleak picture of what the three transitional decades could look like for those who are “unprepared for the upheaval technology is set to bring.”

Approximately 350,000 current cybersecurity openings in the US

In 2017, the U.S. employs nearly 780,000 people in cybersecurity positions, with approximately 350,000 current cybersecurity openings, according to CyberSeek, a project supported by the National Initiative for Cybersecurity Education (NICE), a program of the National Institute of Standards and Technology (NIST) in the U.S. Department of Commerce. The current number of U.S. cybersecurity job openings is up from 209,000 in 2015. At that time, job postings were already up 74 percent over the previous five years, according to a Peninsula Press analysis of numbers from the Bureau of Labor Statistics. Security starts at the top: right now, about 65% of large U.S. companies have a CISO (Chief Information Security Officer) position, up from 50% in 2016, according to ISACA, an independent, nonprofit, global association.

Cybersecurity in an IoT and mobile world: The key trends

Cybersecurity incidents regularly hit the headlines, the WannaCry ransomware outbreak in mid-May being a particularly high-profile example. It says a lot about the current state of cybersecurity that the escalation of ransomware had been widely predicted, that the crude but effective WannaCry attack could easily have been defended against, and that the perpetrators -- despite the attentions of multiple security firms and government agencies -- remain undiscovered (at the time of writing). Talking of predictions, at the start of the year ZDNet's sister site Tech Pro Research examined 345 cybersecurity predictions for 2017 from 49 organisations, assigning them to 39 emergent categories. Here's the ranking of topics that cybersecurity experts were worried about six months ago:

Attack of the Algorithms: Value Chain Disruption in Commodity Trading

The securitization of contracts involves creating standardized products from large-scale, nonstandard contractual agreements between two parties. Examples include agreements regarding the long-term off-take of LNG and structured investment products, including those based on energy consumption patterns. Many traders in less-developed commodity markets have created a business based on the securitization of contracts. Their business model will face growing pressure as commodity markets become increasingly developed and as more short-term markets emerge, offering greater liquidity, price transparency, and ability to hedge risk. That evolution is already evident in a number of markets, including European gas. In addition to the risks for traders, however, there will also be opportunities.

Designing for an unpredictable future

In recent years we have seen a surge of this kind of design thinking in business and a drive in governments to apply design principles to policy. An insurgency of innovation labs has sprung up, like Sitra in Finland, MindLab in Denmark, 18F in the US, and Policy Lab in the UK. Experimentation, prototyping, and openness underpin this way of operating, which puts users first and brings in agile methods from tech and design communities to innovate in new ways. To encourage this, public institutions and charitable foundations have opened up challenge prizes to stimulate markets and promote design-led innovation. Social impact investment funds and incentives like the Industrial Strategy Challenge Fund in the UK now seek to drive innovation further.

11 predictions for the future of programming

When kids in college take a course called “Data Structures,” they get to learn what life was like when their grandparents wrote code and couldn’t depend on the existence of a layer called “the database.” Real programmers had to store, sort, and join tables full of data, without the help of Oracle, MySQL, or MongoDB. Machine learning algorithms are a few short years away from making that jump. Right now programmers and data scientists need to write much of their own code to perform complex analysis. Soon, languages like R and some of the cleverest business intelligence tools will stop being special and start being a regular feature in most software stacks. They’ll go from being four or five special slides in the PowerPoint sales deck to a little rectangle in the architecture drawing that’s taken for granted.

Perpetuating Bias: Why We Should Think Critically About AI in Marketing

“We have a situation where these artificial intelligence systems may be perpetuating historical patterns of bias that we might find socially unacceptable and which we might be trying to move away from.” While many people would assume that artificial intelligence algorithms are objective tools making objective calculations, the fact is these tools are created from and trained on large sets of data (images, text, video, etc.) that currently exist online. This is data that has been created by humans, and thus is data that’s not free from bias. When AI algorithms and content intersect, we need to be careful about the results. The danger with overuse of artificial intelligence in marketing is that our dominant, biased discourses will remain dominant and biased, especially if we assume an AI tool is taking an objective tack.
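
To make the point concrete, here is a minimal, invented sketch of how a model trained on skewed data simply reproduces that skew. The tiny corpus, the roles, and the pronoun counts are all fabricated for illustration; real systems ingest vastly larger datasets, but the mechanism is the same.

```python
# Toy illustration: a word-association "model" learns co-occurrence counts
# from a small, invented corpus, and its output mirrors the corpus's skew.
from collections import Counter

corpus = [
    "nurse she caring",
    "nurse she gentle",
    "engineer he logical",
    "engineer he technical",
    "engineer she logical",
]

# "Training": count which pronoun each role co-occurs with.
associations = {}
for sentence in corpus:
    role, pronoun, _ = sentence.split()
    associations.setdefault(role, Counter())[pronoun] += 1

def most_associated(role):
    """Return the pronoun most often seen alongside this role."""
    return associations[role].most_common(1)[0][0]

print(most_associated("nurse"))     # reflects the corpus skew, not reality
print(most_associated("engineer"))
```

The model is not "objective": it has no opinion of its own, but it faithfully amplifies whatever pattern the training data contains.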

How to Beat the Odds and Make Your First IoT Project a Success

IoT solutions affect multiple teams within the organization. Partner with these affected teams early in the planning process to get their requirements, gain their support (knowledge, resources, and budget), and leverage their influence to remove barriers during the execution stages. Partner with your organization’s digital transformation or innovation office, if one exists. Equally important, partner with IoT solution vendors throughout the process. At this stage of the market, their solutions are still evolving. Work with your IoT vendor at a deeper level than you would with other vendors. Stay in close contact and leverage their product management and technical support teams throughout the project. Co-design the solution and project with them – tell them what features you’d like to see, report bugs, and test updated versions of the product.

Understanding the limits of deep learning

By contrast, humans “learn from very few examples, can do very long-term planning, and are capable of forming abstract models of a situation and [manipulating] these models to achieve extreme generalization.” Even simple human behaviors are laborious to teach to a deep learning algorithm. Let’s examine a situation such as avoiding being hit by a car as you walk down the road. If you go the supervised learning route, you’d need huge data sets of car situations with clearly labeled actions to take, such as “stop” or “move.” Then you’d need to train a neural network to learn the mapping between the situation and the appropriate action. If you go the reinforcement learning route, where you give an algorithm a goal and let it independently determine the ideal actions to take, the computer would need to die thousands of times before learning to avoid cars in different situations.
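
As a rough illustration of the supervised route described above, here is a toy sketch: a handful of hand-labeled (situation, action) pairs and a 1-nearest-neighbor rule standing in for a trained network. The feature encoding (distance and speed of the approaching car) and the labels are invented for the example; the point is how many labeled situations a real system would need to cover the space.

```python
# Toy supervised mapping from situations to actions via 1-nearest neighbor.
# Features and labels are invented; a real system would need vast labeled data.
from math import dist

# Each situation: (car_distance_m, car_speed_mps); label: action to take.
labeled_situations = [
    ((50.0, 0.0), "walk"),  # parked car, far away
    ((40.0, 2.0), "walk"),  # slow car, still distant
    ((10.0, 8.0), "stop"),  # fast car closing in
    ((5.0, 5.0),  "stop"),  # car very close
]

def predict_action(situation):
    """Pick the action of the closest labeled example (1-nearest neighbor)."""
    _, action = min(labeled_situations, key=lambda ex: dist(ex[0], situation))
    return action

print(predict_action((8.0, 7.0)))   # nearby, fast car
print(predict_action((45.0, 1.0)))  # distant, slow car
```

With only four examples the rule generalizes poorly, which is the article's point: unlike a human pedestrian, the algorithm knows nothing beyond the situations it has been shown.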

Quote for the day:

"Leadership is not a solo sport; if you lead alone, you are not leading." -- D.A. Blankinship