Daily Tech Digest - October 16, 2017

Blockchain can fix the sorry state of the real estate industry
A number of blockchain startups are working on tokenizing real estate ownership to overcome these challenges and open real estate investment to more people. An example is BitProperty, a platform that enables property owners to register their property on the blockchain and issue tokens, digital currencies that represent a share of their property. When a person wants to invest in the property, they can purchase any number of its corresponding tokens on BitProperty. Contractors and construction companies can use BitProperty to raise funds for their projects by launching initial coin offerings (ICOs). Anyone who wants to invest in a project can purchase the project’s tokens and, in return, hold a proportional share of the value and revenue of the finished project.
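To make the mechanics concrete, here is a toy sketch (with entirely hypothetical names; it is not BitProperty's actual contract logic) of how token balances can map to proportional shares of a property's revenue:

```python
# Toy sketch of tokenized ownership: token balances determine each
# investor's proportional share of distributed revenue.
class TokenizedProperty:
    def __init__(self, total_tokens):
        self.total_tokens = total_tokens
        self.balances = {}                  # investor -> tokens held

    def buy(self, investor, tokens):
        held = sum(self.balances.values())
        if held + tokens > self.total_tokens:
            raise ValueError("not enough tokens left to sell")
        self.balances[investor] = self.balances.get(investor, 0) + tokens

    def distribute_revenue(self, amount):
        # Each holder receives revenue in proportion to tokens held.
        return {inv: amount * held / self.total_tokens
                for inv, held in self.balances.items()}

prop = TokenizedProperty(total_tokens=1_000)
prop.buy("alice", 250)
prop.buy("bob", 100)
print(prop.distribute_revenue(10_000.0))    # {'alice': 2500.0, 'bob': 1000.0}
```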


For a time, Linux OS computers were expected to become the dominant third player in the PC market, Huang said. For example, many thought Linux would be the ideal OS for netbooks, which peaked in popularity in the late 2000s/early 2010s. But Microsoft captured that market by “bottom-ending” Windows to run on the low-powered portables. There’s still talk that Linux could emerge as the dominant OS for thin clients. But IDC’s data doesn’t support that belief. Linux comprised a 3% share of global PC shipments in 2013, but it’s held steady at a mere 1% since 2015 and is expected to stay at 1% through 2021. By comparison, Chrome OS has risen from 1% of the market in 2013 to its current position of 5.5% in 2017, and IDC expects it to reach 8% by 2021.



Cybersecurity: into the data breach
The vulnerabilities stakeholders face include cyber security, data privacy, data breaches, and payments fraud. The utmost vigilance is required to protect organisations against cyber attacks, and all stakeholders, including regulators, must be more proactive about cybersecurity, taking ownership of the issue to prevent attacks. In the new payments ecosystem, third-party developers can directly interact with a partner bank’s customers, raising questions about data privacy and security. In an increasingly networked ecosystem, identifying the source of an attack will be a challenge. Verizon’s 2017 Data Breach Investigations Report found that security incidents and data breaches affect both large and small financial organisations almost equally. However, the security of larger banks is more difficult to compromise because they invest more in cyber security solutions; smaller banks, which do not have the same access to resources, are more prone to cyberattacks.


A soon-to-be-published study shows how the traditional corporate human resources operation actually hampers cybersecurity hiring against a backdrop of the industry's well-documented talent gap. The Jane Bond Project report, commissioned by security talent recruiting firm CyberSN, found that in addition to the lack of available talent for those positions, respondents say their HR generalists are not equipped to recruit and hire cybersecurity talent, and that flawed salary data complicates their ability to issue competitive job offers. More than 80% of the 83 cybersecurity positions studied in the report ended up with compensation offers higher than the salary caps stated in the original job descriptions. Half of the 52 organizations participating in the study say they had to raise the compensation offers to seal the deal. The positions in the study include security engineers, product sales engineers, incident response analysts, SOC analysts, and product security experts.



Obviously, no company is going to replace all of its hardware overnight; doing so would entail considerable expense, along with implementation and architecture challenges that, until resolved, could disrupt company operations. In addition, there would be plenty of non-technical issues: employees who know device X and network OS Y like the backs of their hands may not look forward to the time it takes to learn new technology and processes. And when a company decides to transform to a software-defined networking infrastructure, it may not get support from its existing network hardware vendor, which may be enjoying hefty margins on network hardware sales and is unlikely to be thrilled to push a technology that makes its expensive boxes replaceable by cheap, vendor-agnostic white boxes.


According to Badman, Extreme's Automated Campus initiative shows great promise due, in part, to 802.1aq shortest path bridging, which supplants routing protocols such as Border Gateway Protocol (BGP), MPLS and Open Shortest Path First (OSPF), thereby reducing complexity. The new network fabric also includes hypersegmentation to contain security breaches, APIs to increase interoperability, and user and device policies that drive automated network changes in conjunction with analytics and changes on the edge. Badman said he views Avaya as one of the leaders of software-defined networking fabrics, adding that Extreme has succeeded in integrating Avaya fabrics since it acquired the vendor. "I'm of the opinion that some vendors are trying to figure out how to proceed with network-wide fabric methods, while painting beta-grade efforts up with glitz and catchy slogans. This just isn't the case for Extreme," he wrote.


Will the Internet of Things rewrite the rules on cyber security?
Despite having capable teams of programmers and rigorous testing procedures, many companies – be they retailers, manufacturers, or service providers – still have a hard time seeing the potential vulnerabilities in their own systems. “There are a lot of companies who think ‘this will never happen’ and then they come back to us six months later saying ‘it happened’,” says Kupev. The challenge, he explains, is being able to look at things from a different point of view. “Often a client’s view of things can be quite narrow because they’re used to looking at things from the same perspective,” he adds. “Our job is to help them look at matters from a different angle and uncover vulnerabilities they would have otherwise missed.” To illustrate his point, Kupev tells the story of an engine maker that invested heavily in ensuring a device’s “regular” communications systems were secure.


The ability to decrypt packets can be used to decrypt TCP SYN packets. This allows an adversary to obtain the TCP sequence numbers of a connection and hijack TCP connections. As a result, even though WPA2 is used, the adversary can now perform one of the most common attacks against open Wi-Fi networks: injecting malicious data into unencrypted HTTP connections. For example, an attacker can abuse this to inject ransomware or malware into websites that the victim is visiting. If the victim uses either the WPA-TKIP or GCMP encryption protocol, instead of AES-CCMP, the impact is especially catastrophic. Against these encryption protocols, nonce reuse enables an adversary to not only decrypt, but also to forge and inject packets. Moreover, because GCMP uses the same authentication key in both communication directions, and this key can be recovered if nonces are reused, it is especially affected.
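A minimal sketch of why nonce reuse is so damaging, assuming only that the cipher derives a per-nonce keystream that is XORed with the plaintext (as CCMP and GCMP effectively do). This illustrates the underlying principle, not the actual KRACK attack code:

```python
# Two packets encrypted under the SAME nonce share the same keystream.
# XORing the two ciphertexts cancels the keystream entirely, so known
# plaintext in one packet directly reveals the other.
import os

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

keystream = os.urandom(32)            # stands in for the per-nonce keystream
p1 = b"GET /login HTTP/1.1\r\n"       # plaintext the attacker can guess
p2 = b"user=alice&pw=hunter2"         # plaintext the attacker wants

c1 = xor(p1, keystream)               # both packets encrypted under
c2 = xor(p2, keystream)               # a reused nonce

# c1 XOR c2 == p1 XOR p2; XOR in the guessed p1 to recover p2.
leaked = xor(xor(c1, c2), p1)
print(leaked)                         # b'user=alice&pw=hunter2'
```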


Metered DDoS pricing used to be more common, said Theresa Abbamondi, director of product management for Arbor Cloud and Services at Arbor Networks, Inc. That created a risk for customers, she said. Arbor has priced based on clean traffic since it launched its service four years ago, making it one of the first vendors to do so. "Most of the purpose-built anti-DDoS vendors quickly moved to this type of clean traffic pricing model, and it became the standard in the high end of the market," she said. "Among vendors like Cloudflare, who sell DDoS as an add-on service to a customer base more interested in the vendor’s core offerings, it’s still common today to see vendors limiting the total bandwidth of traffic they will scrub, blackholing traffic that exceeds that threshold, or hitting the customer with exorbitant, hidden fees," said Abbamondi.


New derived credential technology eliminates the need for a physical card by placing verified identity credentials directly and securely onto the mobile device, much as mobile-pay systems do away with the need to make payments using a plastic credit card. This technology offers the added benefits of making identity verification more convenient, and preventing unauthorized logins. But derived credentials and authentication tools such as biometrics offer only a one-time, “snapshot” form of user verification. Once the user has passed the initial test and gained access, the device and everything on it become fully available for viewing and use. Behavioral analytics promises to change this paradigm.  ... tools designed to capture how a device is used can provide the equivalent of a continuously-authenticating security “video,” to detect interlopers, transaction by transaction.
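As a rough illustration of that continuous "video" idea, a behavioral system might score each window of activity against the owner's baseline. The sketch below is a hypothetical toy, not any vendor's algorithm:

```python
# Toy continuous authentication: compare a rolling window of keystroke
# intervals against a user's baseline and flag sessions that drift.
from statistics import mean, stdev

def anomaly_score(baseline, window):
    mu, sigma = mean(baseline), stdev(baseline)
    return abs(mean(window) - mu) / sigma   # z-score of the window mean

baseline = [0.21, 0.19, 0.22, 0.20, 0.23, 0.18]  # seconds between keystrokes
session  = [0.55, 0.60, 0.52, 0.58]              # a much slower typist

if anomaly_score(baseline, session) > 3.0:
    print("re-authenticate: behaviour deviates from the owner's profile")
```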



Quote for the day:



"Anger is the feeling that makes your mouth work faster than your mind." -- Evan Esar


Daily Tech Digest - October 15, 2017

Data governance stumble

Data governance involves data quality, ownership and security, metadata, and analytics processes. In most organizations, the word “governance” tends to throw off staff, who can become confused about what data governance entails in the organization and what their specific role is. To clear up the role of data governance in the business, it should be defined more in terms of data quality and how higher quality data can advance the efficiency of the business. High data quality should be the fundamental aim of any data governance campaign, and it should be the key area of focus. In fact, research by Gartner showed that poor data quality cost organizations an average of $8 million a year.
Challenges to the proper implementation of data governance
Many businesses spend too much time defining a data governance model, to the point that they end up hindering their organization from becoming data driven.


How to shape your customer experience with big data


It’s clear that the volume of data is increasing and shows no signs of slowing down. As such, brands will have to move beyond using data just to create reports for internal stakeholders; they need to start leveraging cross-departmental data to deliver insights and better shape the customer experience. This comes on the back of companies undergoing digital transformation in Asia Pacific, where the first wave of change was migration to the cloud and the expansion of digital capabilities. With the right tools, the next wave will see marketers using both public and owned data to drive measurable outcomes and enhance customer experience through personalisation. It all sounds great, but where can companies start? Here are some big brands that have successfully embraced data analytics to fuel efficiency and growth.


EA - Why You Should Think About The Enterprise Continuum!

We were really thinking hard about whether TOGAF should stay in the Information Technology (IT) space versus the “enterprise” space, with most of us thinking we needed to go beyond IT architecture. When we agreed to proceed in the enterprise direction, many things emerged as important, including thinking about services – not just IT services, but business-oriented services. We thought a lot about building blocks. As a matter of fact, if you got hold of an old version of TOGAF, you would see an interesting treatment of building blocks and services, and how they would be used in the Architecture Development Method (ADM). Of course, the subject of building blocks generated a need to distinguish between architecture building blocks and solution building blocks, and their relationship. Additional discussion uncovered the observation that there were common problems across enterprises addressed by different architectures and solutions.


Innovation, Tradition, And Striking the Balance


My son turns eleven today. We are all set to celebrate as we always do – our kids love the traditions that come with birthdays, Christmas, Thanksgiving, college football, and too many other events to mention. The house is decorated exactly the same for every birthday. I’m told they love it that way. There will be a special dinner, as always. All this tradition and consistency got me thinking. My children certainly love new things and surprises: new adventures, trips to unknown places, crazy experiences. And still, for a handful of personal milestones, they seem to want, even to need, something familiar and dependable. Certainly, that is to be expected. New experiences bring excitement, anticipation of something unknown, and the possibility of “total awesomeness” (which, I have to imagine, is what the kids are saying nowadays). Those traditions, the patterns sought out by their own brains, bring them a sense of stability, safety, and comfort.


Why Marketing Needs AI

The largest costs in marketing are human-related, from people to make content at scale to running advertising programs. These costs scale upwards at a rate disproportionate to the impact delivered; adding more marketers scales at best linearly, because humans only have 24 hours in a day and do any one task relatively slowly. Compare that with the capabilities of machine learning and artificial intelligence. If I have an analysis problem to solve and sufficient cloud computing infrastructure, instead of having one computer work on the problem, I simply “hire” thousands of temporary computers to instantly complete the job. Once done, those computers move on to other tasks. I could never hire thousands of people in a second and lay them off seconds later – but I can with machines. If all the tasks in marketing were ideally suited to the ways humans work, this solution wouldn’t be much of a solution at all.


Fintech: Too large to ignore, too complex to regulate


Technology-neutral regulation refers to a specific regulatory process under which rules and regulations prevent service providers from preferring one type of technology over another in offering their services, although some experts, such as Professor Matthias Lehmann of the University of Bonn, find the definition of technological neutrality to be ambiguous. While innovation used to be regarded more positively before the 2008 financial crisis, according to Arner, Patrick Armstrong, Senior Risk Analysis Officer on the Innovation and Products Team at the European Securities and Markets Authority (ESMA), pointed out that "regulations are there as a response to the market failure from 10 years ago".  He argued that, in dealing with fintech, regulators act differently, very much depending on the technology involved and the risk that it carries.


Fintech Malaysia Report 2017

Malaysia’s regulators have, over the past few years, taken an open but cautious approach to regulating fintech. Since the appointment of Tan Sri Muhammad Ibrahim as the new Governor of the Central Bank of Malaysia in 2016, we’ve seen several key reforms and regulations introduced, most notably the announcement of Malaysia’s fintech regulatory sandbox. The Fintech Sandbox is open to all fintech companies, including those without a presence in Malaysia; the prerequisite, however, is that the company must have a genuinely innovative solution that fills a gap in the market. Companies are not required to work with a bank, though Bank Negara Malaysia encourages it. Once approved into the sandbox, a fintech company has a 12-month testing period.


Rational Agents for Artificial Intelligence


The path you take will depend on the goals of your AI and how well you understand the complexity and feasibility of various approaches. In this article, we will discuss the approach that is considered more feasible and general for scientific development, i.e., the study of the design of rational/intelligent agents. ... There are four types of agents in general, varying in the level of intelligence or the complexity of the tasks they are able to perform. All the types can improve their performance and generate better actions over time; these can be generalized as learning agents. ... As agents get more complex, so does their internal structure, and the way in which they store internal state changes. By its nature, a simple reflex agent does not need to store a state, but other types do, as the sketch below illustrates.
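A minimal sketch of that distinction, using the standard textbook vacuum-world example (the class and method names here are illustrative, not from the article):

```python
# A simple reflex agent maps the current percept straight to an action;
# a model-based agent also keeps internal state about what it has seen.
class SimpleReflexVacuum:
    def act(self, percept):                 # percept: (location, status)
        location, status = percept
        if status == "dirty":
            return "suck"
        return "right" if location == "A" else "left"

class ModelBasedVacuum:
    def __init__(self):
        self.cleaned = set()                # internal state: squares known clean

    def act(self, percept):
        location, status = percept
        if status == "dirty":
            return "suck"
        self.cleaned.add(location)
        if self.cleaned >= {"A", "B"}:
            return "stop"                   # a pure reflex agent can never decide this
        return "right" if location == "A" else "left"

agent = ModelBasedVacuum()
for p in [("A", "dirty"), ("A", "clean"), ("B", "dirty"), ("B", "clean")]:
    print(p, "->", agent.act(p))
```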


Can a blockchain tech revolutionize corporate deposits?

“What’s been happening over the last five years in the banking industry is banks have been reviewing customers and are looking more closely at the profitability of each client,” Aidoo said. “As a result, banks may turn away less profitable clients.” Yet Aidoo believes there is a solution: the Utility Settlement Coin, which relies on blockchain technology. ... The Utility Settlement Coin is a smart contract that is held at a central bank as collateralized cash. It lets banks accept deposits from corporations and turn some of them into settlement coins, which are really balances held at a central bank. Say there are five casinos in a small area and a gambler buys chips with U.S. dollars at a cashier window at one of them. If the casinos were all using Utility Settlement Coin, the gambler could go to any of the casinos with the same chips and they would honor them, letting the gambler exchange them for cash.


6 Industries That Could Be Forever Changed by Blockchain


The future is about to change with blockchain. A blockchain is essentially a continuously growing list of records, called blocks, which are linked and secured using cryptography. The first work on a cryptographically secured chain of blocks was described in 1991 by Stuart Haber and W. Scott Stornetta. While blockchain is still fairly new to most consumers, experts are beginning to understand that banking and payments aren't the only industries that could be affected by blockchain technology. Other industries could also be affected by this new phenomenon in the future. With every paradigm shift, there are winners and losers, and just as the internet disrupted the way we communicate, blockchain will disrupt a number of industries. The world's crypto-currency market is worth more than 100 billion dollars. Startups are already using blockchain to push transparency and trustworthiness within the digital information ecosystem.



Quote for the day:


"You grow up the day you have your first real laugh at yourself." -- Ethel Barrymore


Daily Tech Digest - October 14, 2017

Alibaba Aims to “Master the Laws” of AI and Put Virtual Helpers Everywhere

Jack Ma, Alibaba’s executive chairman, announced his decision to establish the Alibaba DAMO Academy (DAMO stands for Discovery, Adventure, Momentum, and Outlook) on the first day of the company’s 2017 Computing Conference, which opened on Wednesday. Ma said the academy will do research aimed at “solving problems” related to the Internet of things, fintech, quantum computing, and AI. It will open seven research labs in China, the U.S., Russia, Israel, and Singapore.  Chinese tech companies are increasingly looking to invest in cutting-edge research, especially artificial intelligence. Alibaba’s future has never been so closely intertwined with original research. The company already has more than 25,000 engineers working on applying AI advances to consumer products and cloud computing services.


Reframing growth strategy in a digital economy


Traditional strategic planning is important as a means of understanding the world of today. However, if you aspire to a strategy that will enable your company to achieve disproportionate growth and create competitive advantage, you need to push beyond pure analysis. What if the razor industry, dominated by giants Gillette and Schick, had looked beyond known competitors to anticipate the value in a direct-to-consumer subscription service? Would the e-commerce razor delivery company Dollar Shave Club have had such a meteoric rise? And would powerhouse Unilever, which acquired the startup in 2016, have expanded as meaningfully into the shaving business? Successful digital strategy requires a blend of deductive analysis and the type of inductive reasoning that powers the creative leaps that anticipate and open fundamentally new markets.


How Python rose to the top of the data science world

There are many interesting libraries being developed for Python. As a data scientist or machine learning practitioner, I’d be tempted to highlight the well-maintained tools from the Python core scientific stack. For example, NumPy and SciPy are efficient libraries for working with data arrays and scientific computing. When it comes to serious data wrangling, I use the versatile Pandas package. Pandas is an open source library that provides fast, simplified data manipulation and analysis tools for the Python programming language, with a focus on realistic, high-end data analysis. I’d also recommend Matplotlib for data visualization, and Seaborn for additional plotting capabilities and more specialized plots. And Scikit-learn is a great tool for general machine learning, providing efficient tools for data mining and analysis.
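A minimal end-to-end pass through that stack might look like the following (the iris dataset is just a stand-in; the column names come from scikit-learn's bundled copy):

```python
import pandas as pd
import matplotlib.pyplot as plt
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Wrangle with Pandas: build a DataFrame from the bundled iris data.
iris = load_iris()
df = pd.DataFrame(iris.data, columns=iris.feature_names)
df["target"] = iris.target

# Model with scikit-learn: hold out a test split and fit a classifier.
X_train, X_test, y_train, y_test = train_test_split(
    df[iris.feature_names], df["target"], random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))

# Visualize with Matplotlib via the Pandas plotting interface.
df.plot.scatter(x="sepal length (cm)", y="petal length (cm)", c="target",
                colormap="viridis")
plt.show()
```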



The elastic edge of the new age network

Corporate boundaries now extend far beyond the four walls of the enterprise, and are becoming more fluid and elastic every day. They now incorporate mobile workers, vehicles, pop-up and temporary networks, kiosks, cameras and sensors, to name just a few. Organisations need a WAN with an Elastic Edge – one that effortlessly expands, contracts and stretches to connect any new endpoint, wherever that may be. The traditional branch-centric WAN was never designed to cope with these demands. Highly complex and inflexible, a traditional WAN often inhibits business agility. Increasingly, organisations are looking to a Network-as-a-Service (NaaS) model, where connections can be spun up and down, whenever and wherever they are needed. In this model, organisations can move away from a traditional build-your-own, fixed location network, to a model that can be designed to individual requirements and billed on a ‘pay for what you use’ basis.


5 Misconceptions about Shadow IT

While you might think traditional firewalls are inspecting traffic thoroughly, legacy security tools don't offer the kind of visibility and control organizations need when sending data back and forth to the cloud. For example, cloud resources are accessible by any user, anywhere. An on-premises firewall cannot register that exchange of information. What's needed are advanced security tools, like those available in a Next-Generation Security Platform. These have the ability to inspect every data packet coming into and out of your virtual environment, and to apply security and access policies consistently across physical and cloud resources. Products like App-ID from Palo Alto Networks identify all traffic and apps, so the IT team enjoys end-to-end visibility. In the Tech Pro Research survey, only 47% said they use next-generation firewalls for SaaS access.



How Will AI in FinTech Benefit Consumers?


Areas in the financial industry are dependent on a set of complex rules, and humans have to manually review reams of data. Human error in this process is inevitable. In some cases, the false positive rate of risk detection – the rate at which legitimate activity is mistakenly flagged as fraud – is as high as 60%. As financial services become more complex, and more people have access to these services, the amount of data banks are dealing with is increasing, and the issue of human error will only become more critical. This is why smart systems can be so useful for consumers: they can increase the accuracy with which fraud is detected, and ensure that cases of fraud that are currently being missed are now captured. In the credit card space, for example, MasterCard traces card usage and endpoint access.
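As a hedged sketch of the idea (not MasterCard's actual system), an unsupervised anomaly detector can pick large, rapid-fire charges out of a stream of ordinary transactions without a human reviewing each one:

```python
# Flag anomalous transactions with an unsupervised detector; the data
# here is synthetic and the features are deliberately simplistic.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Features per transaction: [amount in dollars, seconds since previous use].
normal = rng.normal(loc=[50, 3600], scale=[20, 600], size=(1000, 2))
fraud = np.array([[4900.0, 5.0], [3200.0, 2.0]])   # large, rapid-fire charges

detector = IsolationForest(contamination=0.01, random_state=0).fit(normal)
print(detector.predict(fraud))       # -1 marks an anomaly => flag for review
print(detector.predict(normal[:3]))  # mostly +1 => passes without human review
```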


API-Driven Development With OpenAPI And Swagger

Microservices and public APIs grew from the roots of service-oriented architecture (SOA) and software-as-a-service (SaaS). Although SOA has been a trend for many years, widespread adoption has been hamstrung by SOA's complexity and overhead. The industry has settled on RESTful APIs as the de facto standard, providing just enough structure and convention with more real-world flexibility. With REST as the backdrop, we can create formal API definitions that retain human readability, and developers can create tooling around those definitions. In general, REST is a convention for mapping resources to HTTP paths and their associated actions. You've likely seen these as HTTP GET and POST methods. What's key is to use HTTP itself as the standard, and layer conventional mappings on top of that for predictability.
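For illustration, here is what that convention looks like in a minimal service (Flask is an assumption here; any HTTP framework would do). An OpenAPI/Swagger definition then formalizes exactly these paths and methods:

```python
# Resources map to paths; the HTTP method carries the action.
from flask import Flask, jsonify, request

app = Flask(__name__)
users = {1: {"name": "ada"}}

@app.route("/users/<int:user_id>", methods=["GET"])   # read a resource
def get_user(user_id):
    return jsonify(users[user_id])

@app.route("/users", methods=["POST"])                # create a resource
def create_user():
    user_id = max(users) + 1
    users[user_id] = request.get_json()
    return jsonify({"id": user_id}), 201

if __name__ == "__main__":
    app.run()
```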


At the intersection of technology and private equity


Private equity firms are attracted to cloud-based SaaS delivery models, which offer recurring revenue streams via continuing renewals of an existing customer base. Despite the lack of hard assets to lend against, the predictability of subscription-based revenue models offers private equity firms a reason to invest, to hold those investments for longer periods of time, and a cushion from inflated valuations. All of which is not to say that there’s no threat of fracture to this new private equity-tech partnership. Naysayers argue that funds are paying an egregiously unhealthy EBITDA multiple and, in doing so, are creating a new bubble. In addition, for the private equity-tech relationship to work in the long term, funds must be willing to assert unusual control over their investment targets, continue to invest in operationally sound playbooks, and help their portfolio companies nail the subscription model, which requires higher upfront costs.


Using Machine Learning To Multiply Your Digital Marketing ROI

Finding the right marketing messaging for all customer segments is a highly challenging and highly rewarding endeavor. Machine learning pilot programs are providing leading banks with a competitive edge while helping marketers better understand what types of marketing messages are preferred by various audiences. The latest digital advertising platforms and content management tools are making machine learning accessible and affordable to non-technical marketing teams in a variety of industries. With the adoption of these advanced marketing technologies, we are moving into a world in which the most visible marketing asset to digital marketers, the homepage marquee banner, will deliver more personalized messaging, better marketing results, and an overall better experience to all prospects and customers.


IIA research identifies barriers to business intelligence and analytics adoption

While data preparation capabilities are critical to BI and advanced analytics adoption, the “softer” areas of culture, leadership and skills are also key. Even organizations that are strong in these areas say that they see significant challenges with things like innovation, creativity and leadership. As the report reveals, weak adopters of both BI and advanced analytics should see opportunities for executive support. Without a strong vision and buy-in at the executive level, resulting initiatives will naturally fail or underperform. To improve adoption, leaders need to openly demonstrate and quantify the value or metrics of success from these initiatives. As organizations pursue the path from BI to an advanced analytics continuum, the skills and competencies required from data scientists are broader than those of BI/reporting staff or business analysts.




Quote for the day:


"You don't have to know how you are going to accomplish your goals, you just have to know that you will." -- Mike Basevic


Daily Tech Digest - October 13, 2017

Digital banking priority: Make it personal

“We’re seeing a pretty big demand from banks for acquiring new data sources, investing in data and analytics tools and data-related services,” said Nilesh Vaidya, senior vice president at the technology and consulting firm Capgemini. “They’re trying to know their customers better, better identify customer segments and offer more customized products.” Vaidya said that the industry is still in the early stages of this evolution and that much of it is being done by top-tier institutions with larger IT budgets and resources — and it will take time before customers get an Amazon or Netflix-type experience from banks. “There’s a long way to go, but it is something that’s being driven not only [by technology people] but marketing departments and others that want improvements in how they target customers,” he said.


15 Essential Project Management Tools

Top-level project managers are in high demand, thanks to the high-level leadership, knowledge and capabilities they bring to bear on vital business projects. But having the right set of tools is also essential to project management success. Project management tools and templates not only increase team productivity and effectiveness but also prepare the organization for changes brought about by high-impact projects. To perform at their best, project managers need to make the most of tools aimed at business intelligence and analytics, business requirements, change management and project management, as well as a wide array of forms and templates. Here we have compiled the ultimate project manager’s toolkit to help you plan, execute, monitor and successfully polish off your next high-impact project.


Here's Google's biggest secret to not failing at security

Yes, security tends to be viewed as a mundane and necessary evil, but in our world where everything connected to the internet can be hacked, it's suddenly sexy to be able to deliver real security. To better understand BeyondCorp and its implications, I sat down with Sam Srinivas, product management director in Google's Cloud Security and Privacy team. Srinivas came to Google from Juniper Networks, where he was chief technologist in the Security Business Unit. He is also president of the security industry's FIDO Alliance, which is working on open standards for strong authentication. ... The fundamental idea is that access control should be identity- and application-centric, not network-centric. The current model, which depends on a remote access VPN connection to access applications, gives an all-or-none type of access that doesn't fit with the way organizations work today.
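In sketch form, the shift is from "is this request coming from inside the network?" to a per-request check of identity, device state and entitlement. The names below are entirely hypothetical, not Google's API:

```python
# Identity- and application-centric access: trust is computed per
# request from user entitlements and device posture, never from the
# network the request arrived on.
def authorize(user, device, app):
    device_trusted = device.get("managed", False) and device.get("patched", False)
    return device_trusted and app in user.get("entitlements", set())

alice = {"entitlements": {"payroll", "wiki"}}
laptop = {"managed": True, "patched": True}
print(authorize(alice, laptop, "payroll"))  # True, from any network
print(authorize(alice, laptop, "admin"))    # False: no entitlement
```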


Hacking Is Inevitable So It's Time To Assume Our Data Will Be Stolen


As Telang sees it, a determined hacker is probably going to succeed, yet there’s far too little focus on limiting the damage. Credit freezes could be automatic, and wherever possible data could be aggregated to protect individual identities and private information. The types of fraud-protection services that Equifax sells to customers could be made available to victims as a default. Government intervention may be necessary, as consumers are vulnerable to the credit raters’ mistakes but have little choice but to accept their role in finance. Consumers aren’t really customers for Equifax—the company makes money from banks and credit card companies that buy data from it. US senator Elizabeth Warren has said she wants to see the consumer credit rating industry—which is more lightly regulated than banks and credit card companies


Big Data: Out of the Server Room and Into the World

Sensing a lucrative business emerging, most of the major technology companies have rushed in to create and refine new big data tools to satisfy business needs. Microsoft's Azure platform, for example, now offers a cloud-based service that aims to unify big data tools and applications for their customers. It includes tools to discover and classify data from a wide variety of data collection systems. This approach creates a data catalog, which is independent of data storage location and provides searchable, centralized access to all available business data. The end-user can then utilize the data they find in their own business application, as well as contribute new information to the set. Microsoft's hardly alone in the space, having already been joined by industry heavyweights including Oracle, I.B.M., Amazon, and SAP.


Cybersecurity Strategy, Risk Management and List Making

Frameworks are becoming the strategic tools of choice to assess risk, prioritize threats, secure investment and communicate progress for the most pressing security initiatives. They provide assessment mechanisms that enable organizations to determine their current cybersecurity capabilities, set individual goals for a target state, and establish a cybersecurity strategy for improving and maintaining security programs. Frameworks help you understand the maturity of your security activities and can adapt over time to meet the maturity level of the threats you face and the security capabilities you employ. There are various security frameworks that address different types of needs, but one of the most popular is the National Institute of Standards and Technology’s (NIST) Framework for Improving Critical Infrastructure Cybersecurity.


Awareness training is key to reducing security risk

This also needs to be part of a broader top-down effort starting with senior management. Awareness training should be incorporated across all organizations and not just limited to governance, threat detection and incident response plans. The campaign should involve more than serving up a dry set of rules, divorced from the broader business reality. If done the right way, employees will come away with a keen understanding how their cyber behavior can impact the overall business. According to the Global Cyber Security Capacity Centre, this hinges on the organization’s ability to influence attitudes as well as intentions. Unlike training, where employees are quizzed on their knowledge of instructions, the focus of awareness training should be on changing behavior. In terms of making this happen, organizations should make clear to everyone on staff that cybersecurity adherence isn’t optional any longer. It’s strategic.


SAML Explained: What It Is, What It's Used For

In order for SSO to work, a user must be able to authenticate once and receive authorization, based on his or her confirmed identity, to access multiple other computers. This can also work the other way: a single computer may provide services to users authorized on multiple other computers. The SAML standard defines how all these computers communicate with each other securely. ... A SAML assertion is the XML document by which all the information we've been discussing is transmitted from one computer to another. Once an identity provider has determined that you are who you say you are and have the right to access the content or services you're interested in, it sends a SAML assertion to the server that can actually provide those services to you. A SAML assertion may be encrypted for increased security.
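For a feel of the shape of the data, here is a toy (unsigned, unencrypted) assertion parsed with Python's standard library; production code should use a hardened SAML library instead:

```python
# Extract the authenticated subject from a minimal SAML 2.0 assertion.
import xml.etree.ElementTree as ET

NS = {"saml": "urn:oasis:names:tc:SAML:2.0:assertion"}

assertion = """<saml:Assertion xmlns:saml="urn:oasis:names:tc:SAML:2.0:assertion">
  <saml:Subject>
    <saml:NameID>alice@example.com</saml:NameID>
  </saml:Subject>
</saml:Assertion>"""

root = ET.fromstring(assertion)
name_id = root.find("./saml:Subject/saml:NameID", NS)
print("authenticated subject:", name_id.text)
```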


What to do when SQL servers can't keep up with data demands

If anything is likely to change SQL server performance in the next few years, it will be the introduction of 5G connectivity and cloud-based systems. First, the launch of 5G will enhance remote system connections, breaking down the front-line communication delays that hamper server-side operations. Cloud storage, on the other hand, will be a boon to scalability. As with cloud-based SaaS, cloud storage is regularly updated, eliminating network upgrade delays and preventing slowdowns caused by insufficient storage within the system. Additionally, many companies prefer to operate via the cloud for security and stability reasons. Even with replication throughout, onsite physical operating systems tend to be much more prone to damage or failure than cloud storage. With more companies moving to cloud-based storage systems, choosing the proper protocols will be more important than ever.


The Java Evolution of Eclipse Collections

Eclipse Collections is a drop-in replacement for the Java Collections framework. It has JDK-compatible List, Set and Map implementations with a rich API, as well as additional types not found in the JDK such as Bags, Multimaps and BiMaps. Eclipse Collections also has a full complement of primitive containers. It was developed internally at Goldman Sachs for 10 years before being open sourced in 2012 as GS Collections. In 2015, it was migrated to the Eclipse Foundation, and since then, all active development for the framework has been done under the Eclipse Collections name and repository. ... Optional is one of the most popular new features of Java 8. From the Javadoc: "A container object which may or may not contain a non-null value. If a value is present, isPresent() will return true and get() will return the value".



Quote for the day:


"The task of leadership is not to put greatness into humanity, but to elicit it, for the greatness is already there." -- John Buchan


Daily Tech Digest - October 12, 2017

App Economy Talent
Traditionally, security was an afterthought in the development cycle, but over the past few years it’s quickly become a core part of the process. Now aptly called DevSecOps, the process incorporates security earlier into the software development and testing phases as a means to achieve faster, higher quality outcomes that are both innovative and secure. While DevSecOps is growing in popularity, organisations are still struggling to combat malware injections or data breaches, because their developer and IT teams don’t have the security knowledge or skills needed to launch products threat-free. ... With more than half of organisations using DevOps practices across their business or within teams, this skills debt is bound to have a real impact on the productivity of businesses, the safety of their products, and the quality of applications that ultimately form the foundation of today’s digital economy.


Cybersecurity is key for the smart cities of tomorrow

Without a secure cyber foundation, smart cities will crumble. Built on a secure cyber foundation, smart cities will thrive. We were encouraged to see that the proposed legislation specifically focused on developing a “skilled and savvy domestic workforce to support smart cities.” At the heart of the secure smart cities of tomorrow will be a dynamic IT workforce, confident and capable of training and re-training on a consistent basis to stay ahead of the latest threats. Our research shows that just 35 percent of government officials believe that their organizations are well equipped to handle the cyber requirements of smart city projects. Moreover, 40 percent of government officials and personnel cite skills gaps and a lack of necessary technology expertise a primary concern affecting the expansion of smart city initiatives.


Enterprises 'radically outstripping' traditional technology: Nokia

"Nokia is seeing a watershed moment for the industry as significant global trends and changes to the cost base of what we have traditionally considered carrier capabilities are changing the dynamic of the way carriers are addressing the business market for telecommunications," he added. "The challenge now is for our industry -- carriers and suppliers -- to meet business halfway and ensure they understand that we are here to contribute to their future. "Industrial network requirements are rapidly shifting, and networks are changing to meet those needs." Labelling 5G as more than just the next evolution of the traditional network, Conway said it will accelerate this transformation of industry. "5G is specifically being designed to cater for the tens of billions of devices expected for our automated future," he said.


Cyber threats are among top dangers, says Nato

One of the biggest challenges is bringing innovation faster in Nato’s approach to cyber defence, he said. “This is one of the objectives where we still need to push a little harder,” he added. Ducaru said recognising cyber space as an operational domain requires a change of assumption. Previously, Nato worked under the assumption that it could rely on its systems and the integrity of the information, he said. “We concluded that this assumption was no longer valid, and that we needed to change our training, education and planning with the assumption that systems will be disrupted, that we will constantly be under cyber attack, and that we will need to achieve missions under these conditions,” he said. As a result, Nato has switched its focus from “information assurance” to “mission assurance” to support essential operations.


Converging big data, AI, and business intelligence

AI BI cross separate domains and systems
Although different GPU-based database and data analytics solutions offer different capabilities, all are designed to be complementary to or integrated with existing applications and platforms. Most GPU-accelerated AI databases have open architectures, which allow you to integrate machine learning models and libraries, such as TensorFlow, Caffe, and Torch. They also support traditional relational database interfaces, such as SQL-92 and ODBC/JDBC. Data scientists are able to create custom user-defined functions to develop, test, and train simulations and algorithms using published APIs. Converging data science with business intelligence in one database allows you to meet the criteria necessary for AI workloads, including compute, throughput, data management, interoperability, security, elasticity, and usability.


Olympic Games Face Greater Cybersecurity Risks

While most past attacks on sporting events have centered on IT systems at stadiums and ticket sales and operations, future cyberattacks at the Olympics may occur in eight key areas, says Cooper. The areas include cyberattacks to facilitate terrorism, kidnappings and panic-induced stampedes; altering scoring systems; changing photo and video replay equipment; tampering with athlete care food dispensing systems; infiltrating monitoring equipment; tampering with entry systems; and interfering with transportation systems. "I was surprised to learn there are instances where human decisions are overridden by technology," Cooper said, in reference to a growing reliance on using technology to make the first call in a sporting event, rather than a human referee. She pointed to the reliance on the electronic line-calling technology Hawk-Eye, which is used in such sports as tennis.


Why Machine Learning and Why Now?


Although machine learning has already matured to the point where it should be a vital part of organizations’ strategic planning, several factors could limit its progress if leaders don’t plan carefully. These limitations include the quality of data, the abilities of human programmers, and cultural resistance to new ways of working with machines. However, the question is when, not if, today’s data analysis methods become quaint relics of earlier times. This is why organizations must begin experimenting with machine learning now and take the necessary steps to prepare for its widespread use over the coming years. What is driving this inexorable march toward a world that was largely constrained to cheesy sci-fi novels just a few decades ago? Advances in artificial intelligence, of which machine learning is a subset, have a lot to do with it.


Creating a Strategy That Works

Distinctive capabilities are not easy to build. They are complex and expensive, with high fixed costs in human capital, tools, and systems. How then do businesses such as IKEA, Natura, and Danaher design and create the capabilities that give them their edge? How do they bring these capabilities to scale and generate results? To answer these questions, we conducted a study between 2012 and 2014 of a carefully selected group of extraordinary enterprises that were known for their proficiency, for consistently doing things that other businesses couldn’t do. From dozens suggested to us by industry experts, we chose a small group, representing a range of industries and regions, that we could learn about in depth — either from published materials or from interviews with current and former executives.


Understanding the hidden costs of virtualisation

Costs Virtualisation
Today, data underpins business continuity and therefore user expectations for server uptime are higher than ever before. More than at any time, the prospect of downtime is punishing for a company’s reputation and bottom line, meaning it must be avoided. This places added pressure on IT administrators to keep all machines up and running. Ideally, a fully dynamic and optimised infrastructure is achieved by an IT admin carefully running through a checklist or policy each time a new virtual machine (VM) is “spun up”. In reality, IT administrators are extremely strapped for time and can no longer afford to manually go through checklists. Instead, they are spending their resources on keeping the data centre lights on by ensuring users have access to the data and files they need to keep the business moving forward.


Much GDPR prep is a waste of time, warns PwC

Although some organisations claim to be following a risk-based approach to GDPR compliance, Room said that if that activity is not “anchored to a taxonomy of risk”, the activity is “purposeless”, and purposeless activity is one of the quickest ways of being hit by enforcement action, he said. For organisations that have not done any GDPR preparation with just seven months to go before the compliance deadline of 25 May 2018, Room said the biggest risk is that all the third-party service providers that could help have already been snapped up and are working to capacity. In addition to legislative compliance risk, there is also the risk of failing to deliver a GDPR programme, he said, and regulator risk because the Information Commissioner’s Office and all the other EU data protection authorities also form part of the spectrum of risks.



Quote for the day:


"The final test of a leader is that he leaves behind him in other men, the conviction and the will to carry on." -- Walter Lippmann


Daily Tech Digest - October 11, 2017

Development-centric folks tend to come from a mindset where change is the thing that they are paid to accomplish. The business depends on them to respond to changing needs. Because of this relationship, they are often incentivized to create as much change as possible. Operations folks tend to come from a mindset where change is the enemy. The business depends on them to keep the lights on and deliver the services that make the business money today. Operations is motivated to resist change as it undermines stability and reliability. When that disconnect is repaired, not only will both sides be less frustrated, they're able to work much faster. For the enterprise as a whole, DevOps improves both business agility and IT alignment. In other words, businesses can deliver improved services to users while using IT to more efficiently achieve business objectives.


CIO interview: Lee Edwards, IT director, NHS Shared Business Services

Robotic process automation (RPA) will be a key area of focus for Edwards’s team over the coming year when it comes to digital initiatives. This includes a pilot and implementation of RPA and chatbot technology across a number of operational areas. “This will bring significant efficiencies to our back office functions and help improve customer services through improved speed and accuracy of processing,” says Edwards, pointing to recent proofs of concept with RPA technologies such as Kapow and Blue Prism, integrated into a Microsoft Azure technology stack. According to Edwards, there is a lot of potential in joining RPA with technologies such as artificial intelligence (AI) and big data analytics to bring noticeable improvements to back-office processes and the customer engagement experience at NHS SBS.


Why Your Business May Not Be Ready for AI

"Generally, the biggest impediment we see out there is the inability to create a portfolio of initiatives, so having a team or a number of teams coming back and saying, 'These are the 50 things I could do with AI based on what AI is able to do today and in the next 12 months,' and then [it's up to senior management to] prioritize them," said Giacomelli. "You need to have people going through the organization, unearthing places where value can be impacted." Over the last three decades or so, business leaders have been setting strategy and then implementing it, which isn't going to work moving forward. The AI/human equation requires a hypothesis-driven approach in which experiments can fail fast or succeed. "It's a lot more about collective intelligence than let's get a couple of experts and let them tell us where to do this. There are no experts here," Giacomelli said.


Artificial Intelligence: Experts Talk Ethical, Security Concerns

"What's core is making sure there's clear accountability, and being concerned with the types of controls we seek in AI," Sharkey continued. There is a need for deep learning, and deep reinforcement learning, as we seek AI applications in child care, elder care, transport, and agriculture. "Future-proofing" AI should consider its implications for human rights. "Artificial intelligence transforms everything around us; every industry, our health, our education," explained Aleksandra Przegalinska-Skierkowska, assistant professor at Kozminski University and research fellow for Collective Intelligence at MIT. "Especially if we want autonomous vehicles or virtual agents, we need a code of conduct for them." We are at a point when people have begun to reflect on issues related to machine ethics and morality, she added.


Business needs help to act on cyber security advice


“Part of our role is also to cut through the noise and show businesses that the things that will make a difference for the majority of small businesses cost little or nothing to implement.” Many of the things small businesses can do to improve their cyber security only have a cost in time and effort, said Unsworth. “Cyber security is not always about buying a technical solution,” he added. Investments in security technologies depend on the size of the business, the business operating model and what the business is trying to achieve, he said. “So for businesses that handle sensitive information, there is a cost because they need to ensure that data is protected and demonstrate that they have a good security posture.” The role of the London DSC is to identify and prioritise business needs in terms of cyber security controls, said Unsworth.


Is Your Company Ready to Face Tomorrow’s Security Risks?

IT security strategy should take into account a few important factors, such as business and corporate strategy, IT strategy, compliance and standards, regularly repeated analysis of threats, risks and current security state. The starting point for building an IT security strategy should be the determination of goals and direction of the company and its business. Then, the assessment of the current security state should take place. Within this assessment, deep knowledge of the company, its processes, functions and business is needed. The security strategy should always be compatible with the business and company strategy, taking into account future plans and products. If we know where we are, and we understand where the company is heading, we can start working on specifying the desired state of the company’s security and methods


Businesses and their employees admit to GDPR confusion

GDPR Employee Confusion
It is not just inside the office where people need to consider compliance, as 30% of people admitted to having viewed someone else’s laptop during their commute. Businesses must ensure that even when working on the go, their employees are preventing others from obtaining information on their customers and contacts. The new regulations will protect consumers against companies that hold inaccurate and unneeded data about them, as well as ensuring greater emphasis is put on prominent and unambiguous customer consent with the ability to withdraw at any time. ... “Despite the impending GDPR deadline, our research shows that many companies don’t appear to have systems and policies in place to protect sensitive information. If this data is then stored illegally – or falls into the wrong hands – the damage caused to the organisation could be irreparable.”


SailPoint survey reveals cyber risks are leaving IT in the dark

According to SailPoint’s 9th annual Market Pulse Survey, senior IT decision makers say that more cyber risks are being created outside of the IT department’s visibility, yet it remains IT’s responsibility to mitigate these risks. The survey found that organisations need to better define and enforce corporate policies company-wide, addressing risks like shadow IT and bring your own device (BYOD) given today’s increasingly mobile, agile workforce. Recent sprawling attacks like WannaCry that affected organisations worldwide, as well as direct attacks on organisations of all sizes and in all industries, have demonstrated the significant organisational damage they can cause. In the wake of the data breach pandemics seen over the past year, most organisations should take stock of the security controls they currently have in place.


Artificial Intelligence Threats and Promises

Artificial intelligence is unleashing the next wave of digital disruption. Early adopters are already creating competitive advantages. Companies that combine a strong digital foundation with an aggressive adoption of AI are well poised to become leaders in their industry. This potentially allows old-world companies a way to remain relevant in the wake of global competition. The adoption of AI is rapidly reaching a tipping point. A Narrative Science survey last year found 38 percent of enterprises are already using AI, growing to 62 percent by 2018. Forrester Research predicted a greater than 300 percent increase in AI investments from 2016 to 2017. IDC estimated the AI market will grow from $8 billion in 2016 to more than $47 billion in 2020.


Five Key Differences Between Personal and Enterprise Endpoint Security

Most home computer users have administrator rights on their devices. Administrators can install software, which means that attackers can install malware by compromising these credentials. People working in corporate environments rarely have administrator access to their own machines, so they cannot update the operating system or critical programs on their own. As a result, phishing and social media attacks are somewhat more likely to succeed in a consumer environment. However, attacks that manage to infect corporate devices are much more difficult to remove and usually require intervention from those responsible for centralized computer management. If a vendor issues a patch for a significant vulnerability or newly discovered exploit, those responsible for updating systems must do so quickly. 



Quote for the day:


"The actions of a responsible executive are contagious." -- Joe D. Batton


Daily Tech Digest - October 10, 2017

IT spending increases for software-defined storage, on-demand services
SDS is gaining popularity because of its versatility in a modern data center. Enterprise storage has been migrating away from hardware-defined arrays as data centers move to virtualization and cloud-based infrastructure. SDS solutions run on commodity hardware and deliver all functionality, such as provisioning and de-duplication, via software. This adds automation, and thus speed, to storage networks. "For IT organizations undergoing digital transformation, SDS provides a good match for the capabilities needed — flexible IT agility; easier, more intuitive administration driven by the characteristics of autonomous storage management; and lower capital costs due to the use of commodity and off-the-shelf hardware," said Eric Burgener, research director at IDC, in a statement.


Rise in Insider Threats Drives Shift to Training, Data-Level Security

With an insider threat, the culprit is already inside the network. Securing the perimeter around the network — which has long been the focus for enterprise security — does not do the job against this kind of a threat, whether it is malicious or unintentional. Nor is focusing on securing the perimeter the best strategy against many external threats. That's because data-smart companies want to be able to safely give partners, suppliers, and customers access to their networks in order to increase business opportunities. As a result of this shift, security needs to rest with the data itself, not just at the network level. The move to the cloud elevates the need for data-level protection. To reduce the risk of insider threats, companies and organizations need to focus on three areas


Understanding Cloud Native Infrastructure: Interview with Justin Garrison and Kris Nova

A major benefit of public cloud comes from process rather than performance. The people-hours you can save by becoming an infrastructure consumer rather than an infrastructure builder will be very difficult to calculate, but will likely enable a new method of working that far outweighs the technical limitations of a public cloud. Not to mention that some of the best infrastructure builders and maintainers in the world work at public cloud providers, and the companies behind them spend billions every year on infrastructure build-out, R&D, and new features. The biggest consideration when building your own cloud is not what it will cost you to build the private cloud, but what it will cost you to maintain it and what happens when you fall behind public cloud offerings.


Make Cybersecurity A Priority in a Small Business’ Early Stages


The need for strong passwords is crucial for cybersecurity, no matter how often we groan about having to change (and remember) a new one. Shubhomita Bose covers this, along with data from Headway Capital, for smallbiztrends.com. The Headway infographic emphasizes having a company policy to avoid “weak” passwords, to change passwords on a regular basis, and to incorporate “two-factor authentication” — as some businesses are now doing with an additional text-message step in the password process. ... Ransomware is another increasingly significant threat to cybersecurity. As Anita Campbell, CEO of Small Business Trends, writes for Inc.com, “The ransom is displayed on the screen with a message stating you must pay a fine or fee in order to access your own system. Ransoms have ranged from hundreds of dollars to tens of thousands of dollars.”
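
As a minimal illustration of the "strong password" half of such a policy, the sketch below uses Python's standard `secrets` module; the 16-character length and full-punctuation alphabet are illustrative choices, not Headway's specific recommendations.

```python
import secrets
import string

def generate_password(length: int = 16) -> str:
    """Draw a password uniformly at random from letters, digits, and symbols."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())  # a fresh value every run
```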


Leaving employees to manage their own password security is a mistake

“Far too many organizations are leaving the responsibility for password management to their employees and don’t have the automated password management technology in place to identify when things are going wrong,” said Matt Kaplan, GM of LastPass. “In many cases, an organization’s password management practices are overly reliant on manual processes and far too often place an excessive level of trust in employees to use safe password practices. The threat posed by human behavior, coupled with the absence of technology to underpin policy, is leaving companies unnecessarily at risk from weak or shared passwords. Organizations need to focus on solving both obstacles in order to significantly improve their overall security.”
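
A minimal sketch of the kind of automated check Kaplan alludes to might look like the following, which flags reused or short passwords in a hypothetical vault export; real products do considerably more, and the data here is invented for illustration.

```python
from collections import Counter

# Hypothetical export from a password vault: account name -> password.
vault = {
    "jira": "Summer2017!",
    "vpn": "Summer2017!",        # reused: one phish compromises both
    "github": "q8#Lw2v$Zr1m!pX",
}

reuse = Counter(vault.values())
for account, password in vault.items():
    if reuse[password] > 1:
        print(f"{account}: password shared with {reuse[password] - 1} other account(s)")
    if len(password) < 12:
        print(f"{account}: password shorter than 12 characters")
```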


How IPv6 deployment affects the security of IoT devices


As a result of IPv6's vast address space, devices are provisioned with at least one unique global address and NATs are thus doomed to disappear. A NAT's enforcement of the filtering policy that allows only outgoing communications is therefore also likely to disappear, meaning communication between internal and external systems may no longer be policed by the network. In fact, the distinction between internal and external networks may disappear altogether if a filtering policy is not enforced at the network border. While this could have potential benefits -- for example, for peer-to-peer applications, in which unsolicited inbound communications are common -- it clearly comes at the expense of increased attack exposure.
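
The address categories involved can be illustrated with Python's standard `ipaddress` module; the sample addresses below are a link-local address, a unique-local address, and a well-known public DNS resolver.

```python
import ipaddress

# An IPv6 host typically holds several addresses at once; only the global
# one is reachable from outside if no filtering is enforced at the border.
for text in ("fe80::1",                # link-local: never routed
             "fd12:3456:789a::1",      # unique-local: private addressing
             "2001:4860:4860::8888"):  # global unicast: world-reachable
    addr = ipaddress.ip_address(text)
    print(f"{text:22} global={addr.is_global} link_local={addr.is_link_local}")
```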


Organizational Culture Needs To Change So That Security And DevOps Can Exist In Tandem

Cloud adoption was often described in terms of ‘shadow IT’ or ‘bypass IT’, because it frequently occurred outside of the mainstream IT and IT security groups. So in a sense, IT and IT security are still playing catch-up to the original adoption of cloud, even though they have now been given responsibility for it. And we have started to see that change. Two years ago, even in the US, we were often working with those ‘shadow IT’ projects; now the responsibility is moving into IT and IT security, so they’re bringing the traditional mindset. I think the remaining roadblock is that the developer pipeline is moving at a much faster pace than it did historically, when application introduction used to occur over months.


Intel plans hybrid CPU-FPGA chips

“The advantage for FPGA is GPUs play in some areas but not all, and if you look at the use model of inline vs. offload, they are limited to offload mostly. So, there’s a broader application space you can cover with FPGA,” he said. The integrated solution provides tight coupling between CPU and FPGA with very high bandwidth, while the external PCI Express card is not as tightly coupled. For ultra-low latency and high-bandwidth applications, integrated is a great fit, Friebe said. “Most of the differentiation [between integrated and discrete] is due to system architecture and data movement. In a data center environment where [you] run many different workloads, you don’t want to tie it to a particular app,” he said. The more you do specialization, the more performance you can squeeze out of the accelerator, said Friebe.


The future of mobility: Are we asking the right questions?

One of the categories requiring the sharpest questions about the future is mobility. The mobile present has many moving parts and is very complex, but base patterns are discernible. I believe every human on this planet needs at least to attempt to comprehend the current point to which the mobile revolution has brought us. Furthermore, I believe modern executives have a fiduciary responsibility to think long and hard about where the mobile revolution is taking us.  The most rapidly adopted consumer technology in the history of mankind, mobile technology has had a huge economic impact — more than $1 trillion — and has changed the corporate competitive landscape as well as how people live their daily lives. Some go so far as to argue that mobile technologies have changed what it is to be human.


Detecting and Analyzing Redundant Code

A typical analysis would involve running the tool repeatedly to prune back the source tree as brutally as possible, followed by several cycles of reverting changes in order to get successful builds and passing tests. Failures occurred either because the tool had behaved incorrectly or because of a known limitation, examples of the latter being reflection or the existence of a code contract. The tool was exercised on various GitHub repositories for C# projects, chosen on the basis that I had used them and thus wanted to contribute back. Ultimately a pull request was submitted to the community asking for discussion of the changes in my branch. As the tool is brutal and I was engaging online with people for the first time, this was where diplomacy was required, and hopefully I didn’t offend too many people.
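
The article's tool targets C#, but the core idea (find definitions that nothing references, then prune) can be sketched in a few lines of Python with the standard `ast` module; this toy version shares the same blind spot around reflection that the author mentions.

```python
import ast

SOURCE = """
def used():
    return 1

def unused():
    return 2

print(used())
"""

tree = ast.parse(SOURCE)
defined = {node.name for node in ast.walk(tree)
           if isinstance(node, ast.FunctionDef)}
called = {node.func.id for node in ast.walk(tree)
          if isinstance(node, ast.Call) and isinstance(node.func, ast.Name)}

# Candidates for pruning. Like the article's tool, this is "brutal":
# a call made via reflection (getattr with a string) would not be seen,
# and the deletion would wrongly remove live code.
print("redundant:", defined - called)   # -> redundant: {'unused'}
```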



Quote for the day:


"When you're around someone good, your own standards are raised." -- Ritchie Blackmore