Daily Tech Digest - March 31, 2018

Under the new proposal, individuals would be required to disclose social media usernames (though not passwords) when applying for a visa to enter the United States, which would affect nearly 15 million people per year, according to the AP report. The proposal would require applicants to disclose five years' worth of social media usernames on platforms identified in the application form (Facebook, Twitter, etc.) while providing a separate field for applicants to volunteer the usernames of platforms not specifically identified in the form. Previous implementations of this rule applied only to people individually identified for additional background checks, which the AP report indicates is about 65,000 people annually. The idea of collecting social media information of visa applicants started during the Obama administration following the 2015 San Bernardino attack.


Everything You Were Afraid to Ask About Crypto Taxes

Many cryptocurrency investors have made a fortune the past several years selling high-flying bitcoin and other cryptocurrencies for cash. Unfortunately, far too many of them in the U.S. did not report this taxable income to the IRS. The agency figures hundreds of thousands of U.S. residents did not report income from sales or exchanges of cryptocurrency and it might be able to collect several billion dollars in back taxes, penalties, and interest. ... The character of the gain or loss generally depends on whether the virtual currency is a capital asset in the hands of the taxpayer. If it is, a taxpayer generally realizes a capital gain or loss on the sale. If not, the taxpayer realizes an ordinary gain or loss. The distinction is more than academic. ... When a taxpayer successfully mines virtual currency, the fair market value of the virtual currency generated as of the date of receipt is includable in gross income.
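
To make the taxable events concrete, here is a minimal Python sketch (illustrative figures only, not tax advice) of the two computations described above: a capital gain when the currency is a capital asset that is sold, and ordinary income when coins are mined and valued at fair market value on the date of receipt.

```python
# Illustrative only -- not tax advice. All figures are hypothetical.

def capital_gain(sale_proceeds_usd, cost_basis_usd):
    """Gain or loss when the virtual currency is a capital asset:
    what you received on sale minus what you originally paid."""
    return sale_proceeds_usd - cost_basis_usd

def mining_income(coins_mined, fair_market_value_usd_per_coin):
    """Mined coins are ordinary income at fair market value on the date of receipt."""
    return coins_mined * fair_market_value_usd_per_coin

# Bought 1 BTC for $3,000 and later sold it for $10,000:
print(capital_gain(10_000, 3_000))   # 7000 -> reportable capital gain

# Mined 0.5 BTC on a day the coin traded at $8,000:
print(mining_income(0.5, 8_000))     # 4000.0 -> includable in gross income
```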


Don’t get surprised by the cloud’s data-egress fees

Keep this in mind: Most companies that use public clouds pay these fees for day-to-day transactions, such as moving data from cloud-based storage to on-premises storage. Those just starting out with cloud won’t feel the sting of these fees, but advanced users could end up pushing and pulling terabytes of data from their cloud provider and end up with a significant egress bill.  It’s not major money that will break the budget, but egress fees are often overlooked when doing business planning and when considering the ROI of cloud hosting. Indeed, for at least the next few years, IT organizations will be making their cloud-based applications and data work and play well with on-premises data. That means a lot of data will move back and forth, and that means higher egress fees. My best advice is to put automated cost usage and cost governance tools in place to make sure you understand what’s being charged, and for what services.
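
As a rough illustration of how these charges add up, the sketch below estimates a monthly egress bill from the volume of data pulled out of the cloud. The flat per-gigabyte rate is a hypothetical placeholder; real providers use tiered, region-specific pricing, which is exactly why cost governance tooling matters.

```python
# Hypothetical flat rate; actual providers use tiered, region-specific pricing.
EGRESS_RATE_USD_PER_GB = 0.09

def monthly_egress_cost(terabytes_moved_out):
    """Estimate the charge for data pulled out of the cloud in one month."""
    gigabytes = terabytes_moved_out * 1024
    return gigabytes * EGRESS_RATE_USD_PER_GB

# Pulling 20 TB back on premises each month:
print(f"${monthly_egress_cost(20):,.2f}")  # roughly $1,843 -- easy to miss in ROI planning
```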


Agile’s dark secret? IT has little need for the usual methodologies

Buy-when-you-can/build-when-you-have-to is profoundly different. When IT implements multiple COTS/SaaS packages, existing databases make the job harder, not easier. Each package comes with its own database, and when the software vendors design their databases they don’t take the ones you already have into account because they can’t. They also don’t take the ones other vendors have already designed into account because why would they? So when IT implements a COTS package it has to track down all the existing databases that store the same data the new application manages too — not to take advantage of it, but to figure out how to keep the overlapping data synchronized. When it comes to managing data, internal development and package implementations are completely different.


Cyber threat to energy infrastructure, Kaspersky Lab research finds

Cyber security incidents and targeted attacks over the past couple of years, along with regulatory initiatives, make a strong case for power and energy companies to start adopting cyber security products and measures for their operational technology (OT) systems. Moreover, the modern power grid is one of the most extensive systems of interconnected industrial objects, with a large number of computers connected to the network and a relatively high degree of exposure to cyber threats, as demonstrated by Kaspersky Lab ICS CERT statistics. In turn, the high percentage of attacked ICS computers in engineering and ICS integration businesses is another serious problem, given that the supply chain attack vector has been used in some devastating attacks in recent years.


ICS cybersecurity: The missing ingredient in the IoT growth equation

There’s a lot to be gained by adopting connected IoT or IIoT technologies within OT networks and industrial control systems (ICS) environments. By using common internet protocols combined with the cost-savings of using connected terminals, industrial operations can utilize real-time analytics and multisite connectivity to improve efficiencies across numerous industrial verticals. So, why have ICS practitioners and stakeholders not adopted these new technologies? One word: security. As OT networks begin to integrate more intelligence, such as intelligent human-machine interface and cloud SCADA, ICS practitioners are now unable to reconcile the new security risks that have been created as a result. Since OT networks control critical infrastructure and processes, network failure inherently comes at a greater consequence than in typical IT networks. The potential for substantial financial loss, environmental damage and even loss of human life resulting from a security breach is a real possibility in the industrial realm.


Why Blockchain adoption has stalled in financial services and banking

To be sure, there's no shortage of hype and hope about what blockchain could do for banks and other financial services firms. As The Financial Times detailed, banks could use blockchain technology for everything from the recording and updating of customer identities to the clearing and settlement of loans and securities. ... The reality of blockchain, however, is that banks aren't using it. While there are patches of blockchain activity—Northern Trust using a distributed ledger to manage private equity deals in Guernsey, and ING attempting to build a blockchain for agricultural commodities—Penny Crosman, reporting for American Banker, has declared that adoption has "stalled" for a variety of reasons. The first, ironically, is that most banks still don't have a clear business case for using it. Beyond confusion as to why the banks should be using blockchain at all, there are also concerns about security, legal issues, and the immaturity of the technology itself.


Data theft is the foremost threat for an insurance company

For an insurance company, data theft is the major threat. "For example, whenever a policy comes up for renewal, the policy holders start receiving calls from multiple companies, mostly your competitors. Here, despite ensuring all the security measures, the human factor remains the weakest link," Dhanodkar explains. "It is imperative to enrich people, including employees, customers and third-party vendors, with adequate awareness of the latest threats associated with the digitized economy. Mere basic security awareness will not be effective unless the knowledge is upgraded. The Aadhaar Act also has a lot of implications for the security framework, and we are planning a series of workshops on Aadhaar Act compliance requirements for executives," he said. The CISO’s role demands a sound understanding of technology and business. "Being a CISO, I cannot keep blocking everything but must act as an enabler," he added.


Should software developers have a code of ethics?

Teaching people to ask the right questions involves understanding what the questions are, says Burton, and that everyone’s values are different; some individuals have no problem working on software that runs nuclear reactors, or developing targeting systems for drones, or smart bombs, or military craft. “The truth is, we’ve been here before, and we’re already making strides toward mitigating risks and unintended consequences. We know we have to be really careful about how we’re using some of these technologies. It’s not even a question of can we build it anymore, because we know the technology and capability is out there to build whatever we can think of. The questions should be around should it be built, what are the fail safes, and what can we do to make sure we’re having the least harmful impact we can?” he says. Burton believes, despite the naysayers, that AI, machine learning and automation can actually help solve these ethical problems by freeing up humans to contemplate more fully the impacts of the technology they’re building.


How utility industries can leverage location data, AI and IoT

Organizations are looking at how AI and IoT can reduce cost, drive efficiencies, and enhance competitive advantage and support emerging business models. It is also clearly observed that some technical innovations from the mainstream of the IT world, or from other industries, are creating opportunities to leverage technology that did not exist previously in the industry. The industry has, in the past, pursued a siloed approach to applications and technologies. This is characterized by the separation of the engineering and operations groups from IT, and the use of stand-alone, best-of-breed applications within the overall scope of IT. As ubiquitous connectivity continues to permeate technology sectors, an increasing need to unite energy technologies, operational technologies (such as sensors and smart devices) and IT (such as big data, advanced analytics and asset performance management [APM]) with consumer technologies is observed in the industry.



Quote for the day:


"If you're not failing once in a while, it probably means you're not stretching yourself." -- Lewis Pugh


Daily Tech Digest - March 30, 2018

New ways to trade data


In 2016, according to Cisco, an American technology group, the volume of data flowing through the internet each month passed a zettabyte, enough to fill some 16bn 64GB iPhones. By 2025 it will be many times greater. Immeasurably more data sit outside the public internet on company servers. Most of these data are valuable information, which means that people are keen to trade it. Typically, data deals are at present worked out between someone holding the information and those who want to extract insights from it. For instance, Uber has deals allowing many cities to access data generated by its fleet of drivers. This helps city planners understand traffic flows. ... These new data markets face stiff challenges. Maintaining individual privacy and monitoring questions to prevent corporate leaks will be difficult. The cryptography securing the network needs to be airtight. Perhaps the biggest challenge will be convincing people to use them.



Enterprise Architecture Framework - Non-Functional Attributes

Non-Functional Attributes (NFAs) always exist, though their significance and priority differ when they are considered alongside certain other functional or non-functional attributes. It’s particularly important to pay attention to them in the initial phase of the EA framework development, as these attributes may have a direct or indirect impact on some of the functional attributes of the framework. Considering non-functional attributes early in the lifecycle is important because NFAs tend to be cross-cutting and tend to drive important aspects of your architecture, so they have a considerable impact on certain important aspects of your test strategy. For example, security requirements will drive the need to support security testing, performance requirements will drive the need for stress and load testing, and so on. These testing needs in turn may drive aspects of your test environments and your testing tool choices.


Hadoop 3.0 and the Decoupling of Hadoop Compute From Storage


In 2018, discussions about big data infrastructure no longer revolve around methods to reduce network traffic through the use of clever data placement algorithms; instead, there are now more discussions about how to reduce the cost of reliable, distributed storage. The Hadoop open-source community has brought this discussion to the forefront with the recent introduction of Apache Hadoop version 3.0. One of the key features of Hadoop 3 is Erasure Coding for the Hadoop Distributed File System (HDFS), as an alternative to the venerable HDFS 3x data replication. Under typical configurations, Erasure Coding reduces HDFS storage cost by ~50% compared with the traditional 3x data replication. Over the past few years, the Hadoop community has discussed the potential storage cost savings that Erasure Coding will bring to HDFS; and many have questioned whether 3x data replication still makes sense, given the advancements in hardware and networks over the last ten years.
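
The roughly 50% saving follows directly from the storage overhead of each scheme: 3x replication keeps three full copies of every block, while an erasure coding layout such as Reed-Solomon RS(6,3), used here purely as an example policy, stores six data units plus three parity units. A short sketch of that arithmetic:

```python
def raw_storage_needed(logical_tb, data_units, redundancy_units):
    """Raw capacity required to hold `logical_tb` of user data
    for a given layout of data units plus redundancy units."""
    return logical_tb * (data_units + redundancy_units) / data_units

logical = 100  # TB of user data

replication_3x = raw_storage_needed(logical, data_units=1, redundancy_units=2)  # 3 full copies
erasure_rs_6_3 = raw_storage_needed(logical, data_units=6, redundancy_units=3)  # RS(6,3)

print(replication_3x)                       # 300.0 TB raw
print(erasure_rs_6_3)                       # 150.0 TB raw
print(1 - erasure_rs_6_3 / replication_3x)  # 0.5 -> ~50% less raw storage
```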


Why unemployment isn't the robots' fault, it's ours

New technology creates a shift. The inventions of the industrial era led to the decline of blacksmiths, but they gave rise to the steelworker. AI, similarly, has the potential to create such opportunities. CNBC reported Business Insider CRO Peter Spande as saying that nearly US$2 billion has been spent on AI advertising alone this year. Technology research and analysis company Gartner estimates that by the year 2020 the AI field will have created 2.3 million jobs. About this estimate, Svetlana Sicular of Gartner says, “unfortunately, most calamitous warnings of job losses confuse AI with automation — that overshadows the greatest AI benefit — AI augmentation — a combination of human and artificial intelligence, where both complement each other”. There will, in fact, be job losses to the tune of around 1.8 million — mainly mid- and low-level positions — but new ones will be created at the highly skilled, management and even entry levels. Gartner estimates that by 2022, one in five workers who conduct non-routine tasks will depend on AI to do their job.


Dark data and big data analytics
A report from Deloitte titled Dark Analytics: Illuminating Opportunities Hidden Within Unstructured Data Tech Trends 2017 discusses how some medical facilities could use dark data to take more all-encompassing approaches to patient care. For example, during consultations, doctors may take handwritten notes and capture voice recordings, plus make notes in emails or cloud-based applications. Collecting it all and making it accessible could improve treatments and insights, reducing the instances of incorrect diagnoses or interventions that don’t work as well as more appropriate options. People in the health care field are also hopeful that dark data could make it possible to analyze population groups. The previously unused data could potentially make predictions about future needs and illness trends that could ultimately affect individuals’ interactions with health professionals and help local health departments understand the situations their staff members will most likely encounter.


AI in Banking – An Analysis of America’s 7 Top Banks

While tech giants tend to hog the limelight on the cutting-edge of technology, AI in banking and other financial sectors is showing signs of interest and adoption – even among the stodgy banking incumbents. Discussions in the media around the emergence of AI in the banking industry range from the topic of automation and its potential to cut countless jobs to startup acquisitions. ... Through facts and quotes from company executives, this article serves to present a concise look at the implementation of AI by the seven leading commercial banks in the U.S. as ranked by the Federal Reserve. Changes in the banking industry directly impact businesses and commerce, and we sought to provide relevant insights for business leaders and professionals interested in the convergence of AI and financial technology. We’ll explore the applications of each bank one-by-one. The top seven US banks below have been rank-ordered by their size, starting with JPMorgan Chase, the largest.


The Future of Financial Services Using Blockchain Technology Infrastructures

Financial innovation can be interpreted as common developments happening gradually in the financial services industry. This includes contemporary markets, technologies, instruments, or institutions. Fintech refers to a distinct area of financial innovation where the centre of interest is transformative technology. Fintech is short for financial technology. A great example of fintech is a P2P lending platform called Zopa, which gives people access to loans directly from connected devices. Financial innovations, on the other hand, may sound like the same thing; however, they are the different devices and institutions which enable people to use financial services. Existing examples of financial innovations include debit cards, ATMs and traditional banking services. ... Traditional banks have been around for centuries. Fintech is bringing banking into the modern age; however, it now threatens to outgrow banking completely. Fintech firms could become superior to banks because they are able to curate big data and offer flexibility in managing money in ways that banks would need to be redesigned from the ground up to match.


Robo-Adviser Startups Are Now Going After Wealthy Clientele

Robo-advisers started out by going after people with limited disposable income and little experience with investing. These companies are able to offer wealth management tools at low fees because they rely mostly on automated software, instead of people, to deliver advice. Customers fill out a risk profile by answering questions about their age and goals to receive a customized portfolio made up of exchange-traded funds and other passive investments. Competition has upped the urgency for these startups to add new products. Charles Schwab Corp., Morgan Stanley and Vanguard Group Inc. have all introduced robo-advisers in recent years and have billions of assets under management already. Betterment and Wealthfront have said they each have more than $10 billion in assets. “The industry has come a long way over the past several years, but success has also attracted new competition, including large and established players across the financial services industry,” said Devin Ryan, an analyst at JMP Securities LLC. “Increasing competition within the digital wealth management space appears to be accelerating the pace of innovation.”
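
The questionnaire-to-portfolio mechanics described above can be pictured with a very small sketch. The scoring rule, thresholds and ticker names below are invented for illustration and are not any particular robo-adviser's model.

```python
# Illustrative only: the scoring rule, caps, tickers and weights are made up.
def recommend_portfolio(age, years_to_goal, risk_tolerance_1_to_10):
    """Map a crude risk profile onto a stock/bond ETF split."""
    score = risk_tolerance_1_to_10 + max(0, (65 - age) // 10) + min(years_to_goal // 5, 4)
    equity_pct = min(90, 30 + score * 5)   # cap equity exposure at 90%
    return {"STOCK_ETF": equity_pct, "BOND_ETF": 100 - equity_pct}

print(recommend_portfolio(age=30, years_to_goal=30, risk_tolerance_1_to_10=7))
# {'STOCK_ETF': 90, 'BOND_ETF': 10}
```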


Saving lives with big data analytics that predict patient outcomes


Cerner’s EDH helps them understand the most significant risks and opportunities for improvement across a population of people. Cerner computes quality scores for manag­ing a number of chronic conditions, and analysts can see which conditions could gain the most by improving those scores. For instance, Cerner can accurately determine the probability that a person has a bloodstream infection, such as sepsis. Sepsis is an uncontrolled inflammatory response to an infection. It is a complex condition which is difficult for a junior doctor or nurse to recognise. From the time sepsis first takes hold, healthcare professionals have only the initial 6 hours after the diagnosis to deliver a group of interventions to the patient. These interventions require close and rapid interaction between teams in the Emergency Department, in the general ward and in Critical Care. For an individual patient, getting the interventions right at the right time may mean a 20-30% better chance of surviving.



The Fourth dimension on agile leadership

“Agile development can easily stall without agile leadership in place,” says Berthelsen. “Here at Fourth, we don’t lead from the top down – we lead from the bottom up. The leadership team inspires all members of staff to become agents for change. Everyone in the organisation has the personal responsibility, accountability and authority to deliver on our clients’ requirements. Once we’ve agreed on an action, we agree as a business. That open style of leadership makes the life of the product owner and the software engineering teams so much easier, as they’re not in the middle of a conflict of priorities.” As the CTO at Fourth, Berthelsen is also best placed to unpick what the characteristics are of a successful agile leader. “It’s all about openness and collaboration. Agile leaders set the vision and give individuals the freedom to follow that vision. They create an environment for people to push decision making authority down to those closest to the information and they remove barriers so individuals can respond in real-time to unfolding situations.” ... Every stakeholder is trained on what agile is, why we’re doing it, and they participate in it every day.



Quote for the day:


"Leadership appears to be the art of getting others to want to do something you are convinced should be done." -- Vance Packard


Daily Tech Digest - March 29, 2018

8 Things to Consider Before Hopping on the Blockchain Train

There’s often a financial aspect that confuses some great ideas with how they’ll be monetized. Businesses that need capital to properly develop their blockchain solution often turn to ICOs, or initial coin offerings, to make it happen. With few regulations, countless projects are setting up a smart contract that offers contributors tokens in exchange for their BTC or ETH. Though it’s an easy way to get some liquid funding, launching an ICO comes with extra responsibilities, other than making a solid service or platform. You’ll also be beholden to a large community of speculators who have only one thing in mind: the price of the token. This complicates matters and may distract from the original goal. Companies like Telegram, though already fully funded and successful, have chosen to ICO even in the post-IPO phase, so that they can fund blockchain-specific projects without dipping into their coffers or raising money the traditional way. Smart contracts, which are a central part of most blockchain solutions, allow savvy programmers to create a tokenized ecosystem whereby all users benefit from participating.
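
The token-sale mechanism mentioned here is, at its core, simple bookkeeping: contributions come in and tokens are credited at a fixed rate until a cap is hit. The Python model below illustrates only that accounting; a real ICO would implement it as an on-chain smart contract, and the rate, cap and address shown are made-up placeholders.

```python
# Illustrative bookkeeping model of a token sale; not an actual smart contract.
class TokenSale:
    def __init__(self, tokens_per_eth=1000, hard_cap_eth=10_000):
        self.tokens_per_eth = tokens_per_eth
        self.hard_cap_eth = hard_cap_eth
        self.raised_eth = 0.0
        self.balances = {}  # contributor address -> token balance

    def contribute(self, address, eth_amount):
        """Accept a contribution and credit tokens at the fixed rate."""
        if self.raised_eth + eth_amount > self.hard_cap_eth:
            raise ValueError("hard cap reached")
        self.raised_eth += eth_amount
        self.balances[address] = self.balances.get(address, 0) + eth_amount * self.tokens_per_eth

sale = TokenSale()
sale.contribute("0xabc...", 5)          # 5 ETH buys 5,000 tokens
print(sale.balances, sale.raised_eth)   # {'0xabc...': 5000} 5.0
```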



4 reasons why 2018 is the year millennials create a flashpoint for the agile revolution

A recent Gallup poll published findings that millennials don't want bosses, they want coaches; and millennials don't want annual reviews, they want ongoing conversations. In agile settings, frequent coaching conversations and immediate feedback are built into the framework. Indeed, some of the earliest resistance to agile came from senior and middle managers because they were forced to challenge their notion of how they add value and it required them to shift from delivering mandates to coaching and empowering. For example, the practice of retrospectives highlights how agile work teams use feedback and coaching to continuously improve. Millennials crave and respond well to these opportunities to better their performance. To borrow a phrase from Billy Joel, millennials “didn’t start the fire” when it comes to agile; however, they are poised to fan the flames as the framework is adopted in new industries and continues to be an effective means to solve problems, produce new products and meet customer needs.


3 factors CIOs must address to stay at the strategy table


Successful CIOs and CISOs are able to educate and communicate with boards of directors. Are you speaking in their language? I talk to many boards, and while the members are hungry for information about these issues, they struggle with context and practicality. For example, it might be easy to see technology’s role in an email outage, but less clear what role new technologies and information security play in preventing a supply-chain disruption. Directors always understand the financial impact, and the more sophisticated CIOs are adept at translating their team’s work into the financial terms that board members get right away. ... In summary, failing to do these things might not have immediate consequences – and perhaps you’re not doing all of these things right now. But by neglecting these aspects of your leadership role, you will cede that responsibility to someone else. On the other hand, by paying attention to them, you will bolster your credibility in the age of digital business.


Linux on Raspberry Pi: SUSE support turns $35 board into enterprise IoT platform

Today Upton is thrilled because SLES 12 SP3 is the first time a major vendor has offered a full, commercially supported Raspberry Pi image. "Unlike two years ago when they just provided a downloadable image with community support, SUSE can now offer 12 x 5 or 24 x 7 support," writes Upton. "This is all built on the same SUSE Linux that is available on everything from Raspberry Pi to the mainframe." According to SUSE, companies have been using SLES for Arm on Raspberry Pi for monitoring older industrial equipment such as robotic screwdrivers and sending alerts when they malfunction. The new SUSE Raspberry Pi image still targets the Raspberry Pi Model 3 B, although SUSE says it is planning support for the new Raspberry Pi Model 3 B+. The new version also contains a few updates and fixes. According to SUSE, developers have made the new image smaller -- around 630MB -- by trimming compilers and debugging tools while tuning the Arm OS for IoT tasks.


1.4B stolen passwords are free for the taking: What we know now

Single-factor authentication based on "something you know" (e.g., a password) is no longer an acceptable best practice. "I'm pretty well convinced passwords are a horrible system," Professor Douglas W. Jones of the University of Iowa says. "If someone knows your old passwords, they can catch onto your system. If you're in the habit of inventing passwords with the name of a place you've lived and the zip code, for example, they could find out where I have lived in the past by mining my Facebook posts or something." Indeed, browsing through third-party password breaches offers glimpses into the things people hold dear — names of spouses and children, prayers, and favorite places or football teams. The passwords may no longer be valid, but that window into people's secret thoughts remains open. These massive dumps of free passwords lower the cost of an attack dramatically. Password reuse or password guessing attacks are script kiddie stuff. Defending your organization against such threats is basic due diligence.
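
One concrete piece of that due diligence is checking whether a candidate password already appears in public breach corpora. The sketch below queries the Pwned Passwords range API, which accepts only the first five characters of the password's SHA-1 hash (a k-anonymity scheme), so the password itself never leaves your machine; it assumes the third-party `requests` package is installed.

```python
import hashlib
import requests

def times_seen_in_breaches(password: str) -> int:
    """Return how often a password appears in the Pwned Passwords corpus (0 = not found)."""
    sha1 = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = sha1[:5], sha1[5:]
    # Only the 5-character hash prefix is sent; matching happens locally on the response.
    resp = requests.get(f"https://api.pwnedpasswords.com/range/{prefix}", timeout=10)
    resp.raise_for_status()
    for line in resp.text.splitlines():
        candidate, _, count = line.partition(":")
        if candidate == suffix:
            return int(count)
    return 0

print(times_seen_in_breaches("password123"))  # a very large number -- reject it at signup
```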


10 Machine Learning Algorithms You Should Know to Become a Data Scientist

Machine learning practitioners have different personalities. While some of them are “I am an expert in X and X can train on any type of data,” where X = some algorithm, others are “right tool for the right job” people. A lot of them also subscribe to a “Jack of all trades, master of one” strategy, where they have one area of deep expertise and know slightly about different fields of machine learning. That said, no one can deny the fact that as practicing data scientists, we have to know basics of some common machine learning algorithms, which would help us engage with a new-domain problem we come across. This is a whirlwind tour of common machine learning algorithms and quick resources about them which can help you get started on them.


11 outsourcing myths debunked

Third-generation outsourcers may think they know everything there is to know about structuring engagements for success, but the reality is that the fundamentals of value creation from outsourcing have changed significantly. Consumption-based pricing is replacing fixed-price models. Contracts designed for efficiency and cost reduction have given way to deals aligned to business outcomes and growth. “The fundamental mindset needed to succeed is very different, and a contract written for efficiency does not align with a contract that needs to drive growth,” says Jimit Arora, partner in Everest Group’s IT services research practice. “Smart clients recognize the limitations of previous templates and are willing to make changes. Clients that want to simply re-purpose because they think it is ‘old wine in new bottle’ are going to struggle with ineffective contracts.” ... The typical outsourcing deal is built to reduce disruption and risk associated with major change. Indeed, both the buyer and service provider account teams are incentivized to protect the status quo, says Bob Cecil.


What is composable infrastructure?

IT resources are treated as services, and the composable aspect refers to the ability to make those resources available on the fly, depending on the needs of different physical, virtual and containerized applications. A management layer is designed to discover and access the pools of compute and storage, ensuring that the right resources are in the right place at the right time. The goal is to reduce underutilization and overprovisioning while creating a more agile data center, says Ric Lewis, senior vice president and general manager of the software-defined and cloud group at Hewlett Packard Enterprise, which offers the Synergy composable infrastructure platform. “When a customer logs onto a public cloud, they grab a set of resources: compute, storage, fabric. ‘I need this much stuff to be able to run this application. Please give that to me. I’ll run this application, and when I’m done, I’ll give it back to you and you can use it with somebody else,’” Lewis says. “What we did with composable infrastructure is build that into the platform. We can do the same dynamic resource sharing.”
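
The "grab resources, run, give them back" model Lewis describes can be illustrated with a toy resource pool. This is only a conceptual sketch of dynamic resource sharing, not HPE Synergy's actual API; the resource types and quantities are arbitrary.

```python
# Conceptual sketch of composable resource pools; not a real platform API.
class ResourcePool:
    def __init__(self, cpu_cores, storage_tb):
        self.free = {"cpu_cores": cpu_cores, "storage_tb": storage_tb}
        self.allocations = {}

    def compose(self, app, cpu_cores, storage_tb):
        """Carve out just enough resources to run an application."""
        need = {"cpu_cores": cpu_cores, "storage_tb": storage_tb}
        if any(self.free[k] < v for k, v in need.items()):
            raise RuntimeError("insufficient free capacity")
        for k, v in need.items():
            self.free[k] -= v
        self.allocations[app] = need

    def release(self, app):
        """Return an application's resources to the shared pool."""
        for k, v in self.allocations.pop(app).items():
            self.free[k] += v

pool = ResourcePool(cpu_cores=256, storage_tb=100)
pool.compose("analytics-job", cpu_cores=64, storage_tb=10)
pool.release("analytics-job")    # capacity goes back for the next workload
print(pool.free)                 # {'cpu_cores': 256, 'storage_tb': 100}
```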


Facebook cuts ties to data brokers in blow to targeted ads

The world’s largest social media company is under pressure to improve its handling of data after disclosing that information about 50 million Facebook users wrongly ended up in the hands of political consultancy Cambridge Analytica. Facebook adjusted the privacy settings on its service on Wednesday, giving users control over their personal information in fewer taps. Facebook has for years given advertisers the option of targeting their ads based on data collected by companies such as Acxiom Corp and Experian PLC. The tool has been widely used among certain categories of advertisers - such as automakers, luxury goods producers and consumer packaged goods companies - who do not sell directly to consumers and have relatively little information about who their customers are, according to Facebook. “While this is common industry practice, we believe this step, winding down over the next six months, will help improve people’s privacy on Facebook,” Graham Mudd, a Facebook product marketing director, said in a statement.


Sensors and machine learning: How applications can see, hear, feel, smell, and taste

In essence, classifying objects with machine or deep learning is first a matter of “seeing” a lot of instances of a sheep or a cat, including various derivatives (big ones, little ones, furry ones, less-furry ones, skinny ones, fat ones, tailless ones). Then it is a matter of training a model that recognizes all of the variants. While Facebook and Google are clearly putting the most weight into this field, there are other tools like the venerable OpenCV library, a grab-bag of functionality, and OpenFace, which is focused on just facial recognition. There is even Jevois (French for “I see”), a smart camera for Arduino devices that has pretrained models based on open source libraries. ... For speech recognition, you can find open source implementations that use the more traditional Hidden Markov models, like CMUSphinx, while Kaldi uses a neural network. There are other implementations, but the breakdown is between online and offline decoding. “Online” means you can read off a mic; “offline” means you have to wait until you have a .wav file.
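
As a concrete example of the "seeing" side, here is a minimal face detection sketch using the Haar cascade model that ships with OpenCV. It assumes the opencv-python package is installed and that a file named photo.jpg exists; this is classical detection rather than the deep learning approaches discussed above.

```python
import cv2

# Load OpenCV's pretrained frontal-face Haar cascade (bundled with opencv-python).
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
detector = cv2.CascadeClassifier(cascade_path)

image = cv2.imread("photo.jpg")                  # hypothetical input image
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)   # the detector works on grayscale

# Returns one (x, y, width, height) rectangle per detected face.
faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
print(f"Found {len(faces)} face(s): {[tuple(f) for f in faces]}")
```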



Quote for the day:


"You may be good. You may even be better than everyone esle. But without a coach you will never be as good as you could be." -- Andy Stanley


Daily Tech Digest - March 28, 2018

Cambridge Analytica’s secret coding sauce allegedly leaked


Security firm UpGuard claims that it found a large code repository from AggregateIQ (AIQ), a Canadian political data firm also active in the 2016 US presidential race, left publicly downloadable online. You might remember that data analytics firm’s name for its part in Brexit: the official Vote Leave campaign gave £3.5m to AIQ, which, like CA, specializes in highly targeted Facebook advertising. Over the weekend, The Guardian reported that CA has undisclosed links to AIQ. The Guardian reports that former CA employee/founder turned whistleblower Christopher Wylie has revealed that besides setting up CA, he was also a central figure in setting up AIQ. AIQ and CA’s parent company, SCL Group, are tied by an intellectual property license, but the threads that bind go way beyond that: Wylie says that some CA staff referred to AIQ as a “department” within the company and that the two businesses shared the same underlying technology. According to UpGuard, it found that technology within an open repository that holds a smorgasbord of tools used to influence individuals.



Cisco emboldens its disaggregation strategy

“We are already seeing customers use these early capabilities to run applications for network analytics, security, network operations workflows and IoT on the network infrastructure,” Cisco said. For its data center customers, Cisco said it will now offer a number of portability options for its Nexus switches and the Nexus Operating System (NX-OS) — including the Cisco Cloud Scale Switch Abstraction Interface (SAI). “SAI gives customers the freedom to run the network operating system of their choice on our SAI-ready Nexus platforms. Microsoft and other web-scale customers are now running their Sonic operating system on these Nexus 9200/9300 platforms,” Cisco stated. In addition, Cisco said it is now possible to run NX-OS on third-party hardware platforms — independent of Nexus switches. Cisco also now offers a virtual NX-OS that “will let customers simulate new features during upgrades through the software upgrade impact on existing tooling environment for their actual large-scale topologies.”


Fixing Hacks Has Deadly Impact on Hospitals

Choi says that hospitals should be careful to focus changes in their security processes, procedures, and technology to improve both data security and patient outcomes. Ponemon sees healthcare organizations starting to improve in security. "We do see healthcare organizations starting to take care of security and rising to the next level of security. I think the public demands it," he says. Two factors contribute to the improvement across the industry, he says. The first is the simple acknowledgement that doctors and hospitals are targets - an acknowledgement that was a long time coming. The next is the march of technology. "There are technologies that healthcare can now afford because they're available in the cloud and it provides the opportunities for healthcare security to improve," Ponemon says. The improved security may come just in time to have an impact on a looming area of security concern: The medical IoT. "There's a universe of devices, many of which are implanted and many can be communicated with through Wi-Fi or Bluetooth," Ponemon says. "Right now, the providers are looking at records but the devices are really an area of huge concern."


A Different Take on Voice Interfaces, IBM Launches Watson Assistant

(Image: Watson Assistant, courtesy of IBM)
IBM's deployment model is a little different from the other assistants, however. You won't be able to download an IBM Watson app directly from IBM, Greenstein said, or buy an IBM smart speaker. Instead, IBM is offering Watson Assistant to business/technology partners who will incorporate it into their own offerings. Those offerings will then be offered to consumers. For instance, an automobile company could embed the assistant in the car dashboard. One early partner is automotive embedded electronics company Harman. Hotels may offer the assistant in a mobile app that consumers can use to control the experience in their hotel rooms in terms of lighting, music, and other features. The service is delivered through the IBM Cloud, and IBM says it also incorporates contextual elements, such as delivering particular experiences to consumers based on their location and time of day, anticipating their needs and proactively making recommendations, the company said. For instance, a traveler may be listening to certain music in her rental car. Watson Assistant could then ensure that same music is playing in her hotel room once she checks in.


Is 2018 the Tipping Point in Digital Transformation?

Digital is destroying this equation by creating more value for customers than for firms. For example, digital competitors with niche products and agile delivery offerings are forcing organizations to unbundle profitable product and service offerings. This results in more freedom of choice for customers to buy only what they need (and not being forced to buy what they don’t need). This is shifting the profit pools and decision making away from the firms and towards the customers. Digital also renders physical distribution intermediaries obsolete. Consider: how healthy is your nearest big-box store? With digital distribution providing limitless choice and price transparency, digital offerings can be reproduced freely, instantly, and perfectly, shifting value to hyper-scale players while driving marginal costs – and product margins – towards zero. ... Profits are no longer distributed across a large number of participants. Think about how Amazon’s market capitalization towers above that of other retailers or how the iPhone regularly captures over 90 percent of ALL the smartphone industry profits.


Why IoT security should keep you up at night

Despite the steady year-over-year growth in worldwide spending, Gartner predicts that through 2020 the biggest inhibitor to growth for IoT security will come from a lack of prioritization and implementation. This means that companies using IoT won’t follow security best practices and use the right tools in IoT planning. This will hamper the potential spend on IoT security by 80 percent, which means that the hackers will go after these connected devices, like they would a bank vault with a screen door. The fact of the matter is that to keep things secure—your cloud-based systems, traditional on-premises systems, and now IoT devices—everything that is interconnected must be secure. Security is like the links of a chain: It’s only as strong as the weakest link. This weakest link is now typically a robot on the factory floor, the thermostat on the wall, or even the fitness tracking watch you’re probably wearing right now. Cloud computing security is holistic, meaning it needs to be systemic to all cloud-based platforms and workloads—including any systems connected to those workloads, and any devices connected to the cloud.


How to Use AI and Blockchain in a Disruptive World

Actively seek out opinions that contradict your own. Don't allow yourself to live within an echo chamber, including only those opinions you already agree with. The truest measure of intelligence is the ability to hold two opposing views at the same time, so practice this. The world is becoming more decentralized, resulting in competing and vastly divergent realities. Nobody's willing to change their mind, and everyone has plenty of evidence that they're right and everyone else is wrong. This is a form of anarchy. Do it differently. Success amidst anarchy requires defiant leadership! The most strategic thing you can do is fine-tune your unique value proposition. Open a document on your computer, or pull out a notebook and write down your thoughts to this: Who are you? What do you stand for? Since the world is characterized by competing realities, you have to pick your reality and ignore the rest. You can't change people's minds. Build your tribe and ignore the haters.


What is the Open Compute Project?

When Facebook designed the hardware for its first dedicated data center in Prineville, Ore., it wanted to make savings on three fronts: energy, materials and money. It boosted energy efficiency by cutting wastage in the power supply and by making the servers taller, which left room for bigger, more effective heatsinks and meant that it could use fans of a larger diameter, able to move more air with less energy. By doing away with vanity faceplates, paint, logos, unneeded expansion slots and components such as video cards and even mounting screws, it saved more than 6 pounds of material per server. That inevitably led to cost reductions, as you don’t pay for electricity you don’t consume or parts you don’t use. On top of that, it made savings on labor: Without the mounting screws, racking and unracking servers was quicker; standardization saved time dealing with spares, and overall systems could be deployed more quickly. In its 2018 spending study, IHS Markit identified the three main barriers to the adoption of OCP hardware as being concerns about security, sourcing, and integration.


Time for Transformational Cybersecurity!

The exponential growth of cyber-attacks, as evidenced by newspaper headlines describing massive loss of our personal information, including credit information and passwords, is now a Presidential level challenge and has elevated cyber to a U.S. National Security warfare area. Not often discussed or considered is the fact that every cyber malware attack must borrow a Central Processing Unit (CPU) instruction from the attack target system in order for the attacking software to operate the malware instructions! In the physical world equivalent, such as bank robberies, criminals must borrow access to city streets, bank buildings, and bank vaults to conduct successful robberies. Fortunately, in cyber space, new synergistic technologies are now available to prevent malware from borrowing CPU instructions, thereby significantly enhancing cyber defense-in-depth. Unfortunately, most organizations are reluctant to purchase this enhanced cybersecurity because they are confused by all of the cybertool hype and fall back on the mythology that persistent cyber intruders will always win, so what they have is good enough.


IT Leadership: Winning at What Cost?

Companies establish and center their culture on a code of conduct that highlights their company’s values. Values start skewing when winning comes into play. All for-profit companies strive to make money, grow business, and put competitors out of business. That is the nature of the game. Thus, winning is also where “shades of grey” can come into play, as sinful deeds can lead to gainful deals. These situations provide leaders with an opportunity to show their true character and deliver their core leadership values. Their team members are watching and observing the rules of engagement. What is allowed? What is a foul? And what falls in between? The simple question they want answered is: How is one rewarded within an organization by its leadership team? This question and its answer drive the behavior of the people within an organization and set the culture for the organization as a whole. For better or worse, leaders need to understand the dynamics with respect to delivering wins because too much rewarding for sins leads to a lethargy from team members who are playing it straight up.



Quote for the day:


"If no good can come from a decision, then no decision should be made." -- Simon Sinek


Daily Tech Digest - March 27, 2018

DDI Market: Rising Demand of Industrial Internet of Things (IIoT)


The global DDI (DNS, DHCP and IP address management) solution market has been segmented on the basis of component, application, deployment, size of organization, end-use industry and geography. On the basis of component type, the global DDI market has been segmented into services and solutions. Additionally, the global DDI market has been segmented on the basis of application, which includes virtualization and cloud, network security, data center transformation and network automation, among others. Moreover, the evolution from the IPv4 Internet protocol (IP) to IPv6 is anticipated to contribute significantly to industrial demand. Across the globe, the growing focus on awareness of impending IPv6 implementations, the need to streamline IP address management, and security risks, among other important factors, are expected to assist the growth of DDI service providers. Security plays a significant role in this development and is likely to drive the growth of the DDI market during the forecast period.



A robot’s biggest challenge? Teenage bullies

There’s also the question of how to ensure that the robots’ interactions with people go smoothly. Hitch says that at first, some store employees are more reluctant than others to accept that the robots don’t pose any physical danger. Eventually, though, human workers warm up to their mechanical coworkers and often give the robots names, like Megan or Eric Jr. Robots are also learning to solve issues that are less simple, like the last-mile problem. Rui Li, ‎the CEO and cofounder of Robby Technologies, showed off a video of his company’s new Robby 2.0, a cooler-size robot on wheels that can deliver up to “70 liters of booze” (or any other order of that size). The robots say “Excuse me” when blocked by pedestrians, and “Thank you” if the people move. Li says besides making the robots polite, the researchers also trained them to stick to one side of the sidewalk, which helped let humans know how to interact with them. “This simple change in the behavior of the robot has solved a big problem,” Li said.


What is the right storage software needed for DevOps to be a success?

An effective storage platform should combine the performance, control and management of internal data centres with the agility and scale of public cloud. This provides organisations with the ability to build and run agile environments for cloud-native and mission-critical applications in their own data centres. The attraction is that it helps to solve the fundamental mismatch between infrastructure and virtual applications and aids in an organisation’s preparation to adopt DevOps practices, which cannot be fully supported by traditional infrastructure. Businesses are increasingly attracted to the DevOps model as a means to accelerate development efforts and deliver new applications and services. DevOps essentially seeks to merge two personas into one and achieve the best communication and collaboration between developers, who create platforms and runtimes, and operations teams, who lead configuration management. The requirements for DevOps are the ability to build applications with the latest production data, distribute updates quickly with more application testing in less time, accelerate the release cycle, speed up integration testing and reduce restoration time.


The Impact of IoT on Application Development

With the emergence of IoT, the integration of technology into our lives has gone up a notch. In the simplest terms, IoT is the concept of connecting various things, or smart devices, as they are called, to the internet. The platform of the Internet of Things brings diverse information together and provides the common language for devices and apps to communicate with each other. IoT, since its arrival, has changed the paradigm of technology entirely. Even our daily routine activities have transformed since the arrival of IoT, from how we drive, to how we make purchases, to how we make use of energy in our homes. The arrival of IoT took place at a very convenient time, when users were looking for something to make their lives more convenient and easy, which is why IoT is such a big hit! You can check out the infographic for more details about IoT.


The Complexities of Scaling IoT Projects

There must be a complete management UI that encompasses the ability to seamlessly connect to a wide number of PLCs, CNCs and robotic systems via edge gateways, with the ability to manage devices and deploy applications and analytics at the edge. The only way to successfully implement an IoT project that scales and allows the enterprise to incorporate new types of information is to use a flexible platform that can adapt to the future. Without such platform adaptability, application development means a lot of headaches. Companies risk being stuck with a static system, which forces them to constantly invest resources—significantly, time—in developing their next generation IoT products and services. There is a strategic dimension to developing adaptive IoT products. Applying good IoT techniques and design principles will probably produce value-added services that are likely to generate revenue, but for enterprises to have the biggest impact they will have to leverage the network effects of a big-data ecosystem, either pre-existing or self-created. The risks of not doing so, and the rewards of achieving this, should be apparent.


Facebook could be hit with $2tn fine after FTC inquiry

“The FTC takes very seriously recent press reports raising substantial concerns about the privacy practices of Facebook,” said Tom Pahl, acting director of the FTC’s bureau of consumer protection. The investigation is in response to “substantial concerns about the privacy practices of Facebook” and will look at whether Facebook engaged in “unfair acts that cause substantial injury to consumers” by sharing data with Cambridge Analytica for use in political campaigns without the knowledge of the data owners. A similar investigation has been launched in the UK by the Information Commissioner’s Office (ICO) which is charged with protecting the privacy of UK citizens. Facebook has responded to news of the FTC investigation by saying the social networking firm remains “strongly committed” to protecting people’s information and “appreciates the opportunity” to answer the FTC’s questions.


Skip containers and do serverless computing instead

“Serverless” refers to services like AWS Lambda that offer developers a way to focus on writing application logic rather than server infrastructure. Yes, this means a developer must trust that AWS, Microsoft, or Google get that infrastructure right, but the upside to embracing these cloud back ends is huge. As such, Stackery told Governor, “Serverless is being driven by mainstream enterprises. We see them leapfrogging containers so they can take something off the shelf and move quickly.” In other words, they’d love to get into containers, but they may lack the expertise. So they’re borrowing that expertise from Amazon or another serverless vendor and skipping the container revolution. For those enterprises less willing to trust their application infrastructure to a cloud vendor, some have hoped to bring serverless “in house,” running it on-premises in a corporate datacenter, just as some hope to ape the benefits of public cloud computing in so-called private clouds in their datacenters. It’s a nice theory. Unfortunately, it doesn’t work. Not for most companies, anyway.
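
The "focus on application logic" point is easiest to see in a Lambda handler: the function below is essentially the entire deployable unit, with the cloud provider responsible for everything underneath. The event shape shown is a hypothetical API Gateway-style payload; the handler signature itself is the standard one AWS Lambda invokes for Python functions.

```python
import json

def lambda_handler(event, context):
    """Entry point invoked by AWS Lambda; no server code exists anywhere in the project."""
    # Hypothetical payload: ?name=world passed via an API Gateway trigger.
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```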


Five Steps to help your organisation implement Machine Learning technologies

Despite investing in machine learning, the new survey indicates that most CIOs do not have the skilled talent, data quality and budgets to fully leverage the technology. For most CIOs, many decisions still require human input. Only 8% of respondents say their use of machine learning is substantially or highly developed, as opposed to 35% for the Internet of things or 65% for analytics. According to a McKinsey study, the three main challenges companies have related to machine learning are designing an organisational structure to support data and analytics, having an effective technology infrastructure, and ensuring senior management are involved. The study then goes on to state that organisations that can harness these capabilities effectively will be able to create significant value and differentiate themselves, while those that fail will find themselves increasingly at a disadvantage. Achieving great value from machine learning doesn’t come from just investing in new technologies, it is also necessary to make significant organisational and process changes, including approaches to talent, IT management and risk management.


Unity Replaces Mono-Based IDE with Visual Studio


Reasons for the change, Unity Technologies said, include taking advantage of new features in the C# programming language starting with version 6.0, and the ability to leverage an upgrade to the .NET 4.6 scripting runtime, which is still in the experimental stage. The company said MonoDevelop-Unity 5.9.6, the latest version of the open source IDE to ship with Unity, doesn't support many of the new C# features and can't debug C# scripts in the new experimental scripting runtime. "It [is] very important for us at Unity that we also provide a great C# IDE experience to accompany the new C# features," Unity Technologies said. Microsoft yesterday noted that Unity development was one of the first scenarios supported out-of-the-box when Visual Studio for Mac was released last year and applauded the move to make it the default IDE for Unity going forward. "This means that everyone will be able to utilize the benefits of the .NET 4.6 scripting runtime upgrade in Unity (currently an experimental feature), including all the goodies of C# 6.0 and access to the Microsoft Azure SDK to add powerful cloud services to your games," Microsoft said.


To protect artificial intelligence from attacks, show it fake data

Goodfellow is best known as the creator of generative adversarial networks (GANs), a type of artificial intelligence that makes use of two networks trained on the same data. One of the networks, called the generator, creates synthetic data, usually images, while the other network, called the discriminator, uses the same data set to determine whether the input is real. Goodfellow went through nearly a dozen examples of how different researchers have used GANs in their work, but he focused on his current main research interest, defending machine-learning systems from being fooled in the first place. He says for earlier technologies, like operating systems, defense of the technology was added afterwards, a mistake he doesn’t want made with machine learning. “I want it to be as secure as possible before we rely on it too much,” he says. GANs are very good at creating realistic adversarial examples, which end up being a very good way to train AI systems to develop a robust defense. If systems are trained on adversarial examples that they have to spot, they get better at recognizing adversarial attacks. The better those adversarial examples, the stronger the defense.
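
For readers unfamiliar with the generator/discriminator setup Goodfellow describes, here is a compressed PyTorch sketch of the adversarial training loop on toy one-dimensional data. The architectures, hyperparameters and data are arbitrary stand-ins; a real GAN would train much larger networks on images for many more iterations.

```python
import torch
import torch.nn as nn

# Toy networks: the generator maps noise to fake samples, the discriminator scores real vs fake.
G = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 1))
D = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())

opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
loss_fn = nn.BCELoss()

real_data = torch.randn(64, 1) * 2 + 3          # stand-in for a real training set
ones, zeros = torch.ones(64, 1), torch.zeros(64, 1)

for step in range(200):
    # 1) Train the discriminator to tell real samples from generated ones.
    fake = G(torch.randn(64, 8)).detach()
    d_loss = loss_fn(D(real_data), ones) + loss_fn(D(fake), zeros)
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # 2) Train the generator to fool the discriminator.
    fake = G(torch.randn(64, 8))
    g_loss = loss_fn(D(fake), ones)
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

print(f"final losses: D={d_loss.item():.3f}  G={g_loss.item():.3f}")
```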



Quote for the day:


"People who think that they are being "exploited" should ask themselves whether they would be missed if they left, or whether people would say: 'Good riddance'?" -- Thomas Sowell


Daily Tech Digest - March 26, 2018

Threat Landscape for Industrial Automation Systems in H2 2017


It should be noted that the CVSS base score does not account for the aspects of security that are specific to industrial automation systems or for the distinctive characteristics of each organization’s industrial processes. This is why, when assessing the severity of a vulnerability, we recommend keeping in mind, in addition to the CVSS score, the possible consequences of its exploitation, such as the non-availability or limited availability of ICS functionality that affects the continuity of the industrial process. The most common types of vulnerabilities include buffer overflow (Stack-Based Buffer Overflow, Heap-Based Buffer Overflow) and improper authentication (Improper Authentication). At the same time, 23% of all vulnerabilities identified are web-related (Injection, Path Traversal, Cross-Site Request Forgery (CSRF), Cross-Site Scripting) and 21% are associated with authentication issues (Improper Authentication, Authentication Bypass, Missing Authentication for Critical Function) and with access control problems.



Global organisations are failing to invest in much-needed security ahead of GDPR

Less than a third (31%) said they have invested in encryption, despite it being one of the few technologies named in the GDPR. Similarly, few organisations have spent money on data loss prevention (33%) or advanced technologies designed to detect network intruders (34%). A quarter of organisations (25%) claimed that limited resources are the biggest challenge to compliance, providing further insight into some of the reasons behind this under-investment. “The GDPR is clear that organisations must find state-of-the-art technologies to help repel cyber-threats and keep key data and systems secure. It’s concerning that IT leaders either don’t have the funds, or can’t find the right tools to tackle compliance,” said Simon Edwards, cyber security solution architect at Trend Micro. “Organisations need defence-in-depth combining a cross-generational blend of tools and techniques, from the endpoint to the network and hybrid cloud environment.”


To understand digital advertising, study its algorithms


Skinner invented a device, now known as a Skinner box, which standardised the process of behavioural experimentation. He used his boxes to control input stimuli (food, light, sound, pain) and then observed output behaviour in an attempt to link the one to the other. Though by no means perfect, the Skinner box was a big advance in the field. Dr Rahwan hopes to do something similar to software using what he calls a Turing box. This “box” is itself a piece of software. Place an algorithm in it, control the data inputs, measure the outcomes, and you will be able to work out exactly how it behaves in different circumstances. Anyone who wants to study an algorithm could upload it to a Turing box. The box’s software would then start running the algorithm through a standard data set of the kind it was designed to crunch. All face-recognition algorithms, for example, would be given the same scientifically validated set of faces. The algorithm’s output—in this case how it classifies different faces—would be recorded and analysed.
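
A harness in the spirit of the Turing box might look like the sketch below: it accepts any algorithm that conforms to a fixed interface, runs it over a standardised input set and records the outputs for analysis. The interface, data and classifiers are assumptions made for illustration, not Dr Rahwan's actual design.

```python
# Toy "Turing box": run any classifier-like algorithm against a fixed,
# standardised input set and log its outputs so behaviour can be compared.
from collections import Counter
from typing import Any, Callable, Iterable

def turing_box(algorithm: Callable[[Any], Any],
               standard_inputs: Iterable[Any]) -> dict:
    """Apply the algorithm to every standard input and summarise its outputs."""
    outputs = [algorithm(item) for item in standard_inputs]
    return {
        "outputs": outputs,
        "distribution": Counter(outputs),   # how often each label appears
    }

# Stand-in for a validated benchmark (a real box would use curated data,
# such as a scientifically validated set of faces).
standard_faces = [
    {"id": 1, "brightness": 0.9},
    {"id": 2, "brightness": 0.4},
    {"id": 3, "brightness": 0.2},
]

# Two hypothetical face-matching algorithms under test.
def classifier_a(face): return "match" if face["brightness"] > 0.5 else "no_match"
def classifier_b(face): return "match" if face["brightness"] > 0.3 else "no_match"

for name, algo in [("A", classifier_a), ("B", classifier_b)]:
    report = turing_box(algo, standard_faces)
    print(name, dict(report["distribution"]))
```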


Where do mobile users fit in with SD-WAN as a service?


Two trends are causing network architects to take a closer look at how mobile users connect to and access company resources using modern technologies. The first is that most workforces are becoming increasingly mobile. Employees often need to work from home or on the go. These employees want to be able to seamlessly access business apps whether they're at home, at a coffee shop or in a taxi driving across town. The second trend is a movement toward the use of public cloud, as opposed to the company's private data center. Common remote access network designs force users to connect to the corporate office network before accessing company resources. This is typically achieved by using remote access VPN client software. If the apps and data no longer reside on the corporate network, however, it's inefficient for users to connect to the corporate office first, only to be redirected back through the internet to public cloud resources.
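
The inefficiency is easiest to see as a forwarding policy: backhauling every flow through the corporate VPN hairpins cloud-bound traffic, whereas an SD-WAN-style policy can send it straight to the provider. The sketch below is a simplified, hypothetical policy function, not any vendor's implementation.

```python
# Simplified, hypothetical forwarding policy for a mobile client:
# traffic for internal subnets rides the VPN tunnel, while traffic
# for cloud/SaaS destinations goes straight out to the internet.
import ipaddress

CORPORATE_SUBNETS = [ipaddress.ip_network("10.0.0.0/8"),
                     ipaddress.ip_network("192.168.0.0/16")]

def next_hop(destination_ip: str) -> str:
    """Decide whether a flow should hairpin through the VPN or go direct."""
    addr = ipaddress.ip_address(destination_ip)
    if any(addr in net for net in CORPORATE_SUBNETS):
        return "vpn_tunnel"        # on-premises app: backhaul to the office
    return "direct_internet"       # public-cloud app: no detour needed

print(next_hop("10.1.2.3"))        # -> vpn_tunnel
print(next_hop("52.10.20.30"))     # -> direct_internet (example public IP)
```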


Great expectations lead to a brave new world

Cloud adoption doesn't just happen; there is a reason for its use. It may be the development of a new application, it may be the end of life of existing hardware or it may be a strategic decision at the highest level. But the decision to use cloud is usually a perfect storm of factors – the current price of resources, issues in finding appropriate staff, a desire to scale and grow, or the ease of using a third party. The spark represents the 'why do this now?' The factors represent the 'why use cloud now?' A primary factor for cloud adoption is reduced costs compared with traditional models, as shown in this chart from Voice of the Enterprise: Cloud Transformation, Organizational Dynamics 2017. CIOs have great expectations with regard to the cost savings that might be achieved relative to traditional platforms. But cost isn't the only driver: The inherent nature of pay-as-you-go pricing and the scalability and time to market it enables are also major motivations, as are availability and performance.


Why Monero Is Going to War Against Big Miners

Largely referred to as monero's first move in a "war" against ASICs, the upcoming software upgrade will render the Antminer X3 ineffective. Not only that, but to keep hardware manufacturers from catching up, these algorithm edits are planned to continue with twice-yearly network upgrades. Stepping back, the move is a defense of the mining made possible by monero's current algorithm, Cryptonight, which allows monero to be mined successfully on consumer-grade laptops. The fear is that, faced with competition from highly efficient ASICs, affordable laptop mining would be silenced. And that's not a development developers are taking lightly. "I will do everything in my power to help the community prevent the proliferation of centralization-inducing ASICs on the monero network," core developer Riccardo "Fluffypony" Spagni declared on GitHub. Because the Antminer X3 is currently issued by a sole supplier, Bitmain, concerns exist that it could lead to certain kinds of attacks, namely ones in which a mining pool takes over the majority of a cryptocurrency's hashrate, creating false transaction histories, double-spending coins and censoring payments.
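
The principle behind the twice-yearly tweaks can be illustrated with a toy proof-of-work: a small change to the hashing recipe invalidates hardware hard-wired to the old variant, while general-purpose CPUs simply run the updated code. The example below is a deliberately simplified stand-in, not Cryptonight itself.

```python
# Toy proof-of-work showing why small algorithm tweaks hurt ASICs:
# software miners just bump the "version", while fixed-function chips
# built around the old recipe produce invalid results. Not Cryptonight.
import hashlib

def pow_hash(block_header: bytes, nonce: int, version: int) -> bytes:
    """Hash recipe that changes slightly with each scheduled upgrade."""
    data = block_header + nonce.to_bytes(8, "little")
    digest = hashlib.sha256(data).digest()
    if version >= 2:
        # The fork's tweak: one extra round with a version-dependent salt.
        digest = hashlib.sha256(digest + bytes([version])).digest()
    return digest

def mine(block_header: bytes, difficulty_prefix: bytes, version: int) -> int:
    """Search for a nonce whose hash meets the (very easy) toy difficulty."""
    nonce = 0
    while not pow_hash(block_header, nonce, version).startswith(difficulty_prefix):
        nonce += 1
    return nonce

header = b"example block header"
nonce_v2 = mine(header, b"\x00", version=2)   # a CPU miner follows the fork
print("valid under v2 rules:", pow_hash(header, nonce_v2, 2).startswith(b"\x00"))
print("same nonce under old v1 rules:",
      pow_hash(header, nonce_v2, 1).startswith(b"\x00"))   # almost always False
```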


Transparent Digital Transformations Mitigate Risk, Aid Business Objectivity


In order to stay competitive, business leaders need to ensure their organisations embrace new technologies, which are already digitally transforming their organisation and industry. However, without an accurate foundation of knowledge about their existing IT infrastructure, their technology procurement decisions will always be risky. Digital Transformation occurs in two ways. Firstly, organisations implement incremental improvements that help parts of the organisation to better perform their fundamental business tasks. Secondly, the organisation completely changes the way it does business—by adjusting its business model or taking advantage of new markets or products—which has the potential to transform the industry and disrupt several others. All large organisations run complicated IT systems that contain a mix of hardware, software and services from a variety of vendors. Making fundamental changes to any such system can result in unexpected side-effects, complications and costs.


The Power of Doubt in Software Testing

It starts with being skeptical of ourselves, with knowing our own biases. We can't trust our eyes, ears or even our memories. Realise that we get fooled easily, on a daily basis. Knowing that we are easy to fool keeps us on our toes, and forces our mind to work harder. I also try to be skeptical of what the majority believes. When you share your opinion with others, it becomes difficult to change your mind. It’s hard to argue against what everyone else believes, conventional wisdom and “accepted truths”. When one person posits a belief, there can be disagreement or debate. But when more than one person agrees that something is the truth, this often shuts down our own inquiry. My adventures in skepticism taught me that we should also be skeptical of certainty. The feeling of certainty is a tricky thing. Scientific studies have shown that, despite how certainty feels, it is neither a conscious choice nor even a thought process. Certainty arises out of brain mechanisms that, like love or anger, function independently of reason.


6 Myths CEOs Believe About Security

Part of the reason for the nihilistic belief that the hacker and malware problem can never be fixed is that the world thinks that hackers are all brilliant, can’t-be-stopped super geniuses. This romantic ideal is readily promoted in Hollywood films that often show the hacker taking over the entire world’s computers by easily guessing the passwords to any system they are presented with. Movie hackers outsmart everyone and can launch nuclear missiles and erase people’s digital identities with a few keystrokes. This mistaken ideal is believed because most people who get hacked or infected with malware aren’t programmers or IT security people. To them it’s sort of like a magical event that must have required Lex Luthor superpowers. The reality is that most hackers are average Joes with average intelligence and are more akin to plumbers and electricians than to Einstein. Hackers just know how to accomplish a particular trade using particular tools passed down by previous tradespeople, but instead of plumbing and electricity, it’s computer hacking.


Behavior-tracking security tech gaining traction at banks

Statistics on biometric growth expectations in devices
“The thing about most of these behavioral biometrics is that they’re passive. They’re happening in the background and the end user doesn’t feel intruded upon,” said Kathleen Peters, Experian’s senior vice president of global fraud and security. BioCatch is now exploring other use cases, like authenticating new customers at the time of sign up, said Frances Zelazny, vice president of marketing at BioCatch. It is also expanding into other industries, such as payroll processing and insurance, she said. ...  Increasingly, companies in need of fraud prevention are asking their service providers whether they also offer those same behavioral biometrics protections. “If you’ve worked with product managers, you know it takes a while to integrate those capabilities and things that show up on product road maps, but we’re actually seeing these capabilities being integrated with other solutions and they’re live,” Pascual said. “That tells you that in this short time frame since BioCatch came into existence, how much interest there really is.”
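
As a rough illustration of the kind of passive signal involved (not BioCatch's product), the sketch below derives simple keystroke-dynamics features, dwell time and flight time, and compares them against a stored profile in the background; the events, profile and tolerance are invented for the example.

```python
# Illustrative passive behavioural-biometric features from keystroke timings:
# dwell time (how long a key is held) and flight time (gap between keys).
# A real system would compare these against a rich historical profile;
# the events, profile and threshold here are invented for the example.
from statistics import mean

# (key, press_time_s, release_time_s) captured silently in the background
events = [("p", 0.00, 0.09), ("a", 0.15, 0.22), ("s", 0.30, 0.41), ("s", 0.52, 0.60)]

def keystroke_features(evts):
    dwell = [release - press for _, press, release in evts]
    flight = [evts[i + 1][1] - evts[i][2] for i in range(len(evts) - 1)]
    return {"mean_dwell": mean(dwell), "mean_flight": mean(flight)}

profile = {"mean_dwell": 0.09, "mean_flight": 0.08}   # stored for this user

def looks_like_user(features, profile, tolerance=0.05):
    """Very crude check: is this session within tolerance of the stored profile?"""
    return all(abs(features[k] - profile[k]) <= tolerance for k in profile)

features = keystroke_features(events)
print(features, "accepted" if looks_like_user(features, profile) else "step-up auth")
```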



Quote for the day:


"A bad habit never disappears miraculously; it's an undo-it-yourself project." -- Abigail Van Buren