Daily Tech Digest - May 07, 2018

The Impending Facial Recognition Singularity

The result is that you will be identified every time your picture appears. If you use a real photo as your avatar, then your accounts will be connected, even if you use different names, account IDs, and email addresses. Even if you post an untagged photo of yourself to a site, the surrounding text will probably allow the system to know that you are in the picture somewhere; and after a handful of pictures, it will be obvious to the computer which face is yours. At the opposite end of the spectrum, however, an account with absolutely no photos might prevent identification, but it will stand out as fake. The offline situation is no better. Cameras are becoming so inexpensive that they are built into all kinds of things, and they are a cheap and easy way of allowing computers to sense and react to their environment. For example, cameras have been built into thermostats, smoke detectors, doorbells, and toys. Over time, all of these camera-equipped IoT devices have created an Internet of Cameras (IoC). Taken together, government and private cameras provide almost complete coverage of our lives. Soon we will be seen and recognized everywhere we go.


Building Cybersecurity Shock Absorbers For The Enterprise

You know your data best, he continues, you know which “systems...are most important, what is the downtime that you can afford to have, where does the data move, where does the data exist.” Outside parties aren’t in your company every day. The only way they understand your priorities is through you. That doesn’t mean you shouldn’t look beyond yourself for advice. Building resiliency across the entire organization takes everyone. Non-security colleagues may have better ideas than you think. Mignona Cote, global head of identity and access management for insurance company AIG, notes that there’s a department in every business that’s mitigated risk much longer than infosec: accounting. “The finance people have been control people for years, way before we were,” she explains. “When I was an IT person and tried to do something with numbers or whatever, it always knocked the general ledger out of balance and people would come looking for me. They actually knew how to look at the logs -- the transaction logs -- which [security] never really embraced. There's a level of control that we need to focus on outside of what we typically do as IT professionals.”


Google could be getting serious about IoT with release of Android Things

Raspberry Pi 3 and Android Things
The idea behind Things is to provide a unified, one-size-fits-all software option for the developers of constrained devices like smart displays, kiosks and digital signage, among others. Device makers won’t be allowed to modify parts of Android Things’ code, specifically the parts that ensure Google can flash updates to all devices running the software at any time. That’s a potentially major sea change for the IoT, should use of Things become widespread. If security is far and away the biggest stumbling block to IoT deployments, the inability or unwillingness of some device makers to regularly update their software to patch known security holes is arguably the biggest part of that problem. Regular, guaranteed software updates could go a long way toward making IoT more attractive to the more risk-averse enterprise and industrial users that will account for all that exponential growth being predicted for the IoT marketplace. There’s many a slip ‘twixt the cup and the lip, of course – Things is architected as more of an entry-level, consumer-style product at this point, for starters. But the multiplicity of developer sessions scheduled for it at this year’s I/O conference suggests that Google is serious about moving the framework forward as an option for device makers, and broadening its appeal among them.


How Mobile AI Will Transform Our Lives


The future of mobile AI is progressing rapidly. Businesses involved in component manufacturing and app development for the mobile phone industry aim to make improvements in the following areas. Better components and hardware features improve a mobile device’s ability to gather information from its surrounding environment. Previously, the phone camera was just a way to capture images and record videos, while the microphone was a way for the user to communicate during calls. In the next generation of mobile phones, the camera and microphone will act as the eyes and ears of the intelligent phone. These components are expected to give the phone the ability to become aware of the world around it and make recommendations for its users’ benefit. Add facial recognition and GPS location to the mix and we come very close to a device that can understand its users’ wants and act as an assistant rather than just a communication device. The facial recognition feature is particularly useful, as it would give the phone the ability to recognize the user’s emotions. The device would know when the user is sad, happy, or hungry.


How to Collect Meaningful Data

The hard truth here is that bad data leads to bad decisions. Thus, it is important to take the time necessary to build a proper data collection process. Two weeks ago, as I completed my big data certification, the importance of proper data collection became clear. It also reminded me of some basic data collection techniques I learned during Six Sigma training. That's what I want to share with you today. There are many benefits to building a proper data collection process. The primary benefit will be to the teams that need to sift through the data for insights. The sooner they get value from the data, the better. This saves time and money for everyone involved. Having a proper data collection process allows you to document what data is being collected, by whom, and for what purpose. Your data collection process should be part of a larger data governance strategy. Unfortunately, data governance is one of those things that happens only after a company grows to a certain size. (So is data security, but I digress.) Here's a simple process outline for you to review. It's worked well for me over the years. Feel free to adopt or change it for your own needs. Use whatever you can to build a data collection process that helps you gather meaningful data.


Deep learning comes full circle

Whatever the underlying reason, insights gleaned from the 2014 study led to what Yamins calls goal-directed models of the brain: rather than trying to model neural activity in the brain directly, train artificial intelligence to solve problems the brain needs to solve, then use the resulting AI system as a model of the brain. Since 2014, Yamins and collaborators have been refining the original goal-directed model of the brain’s vision circuits and extending the work in new directions, including understanding the neural circuits that process inputs from rodents’ whiskers. In perhaps the most ambitious project, Yamins and postdoctoral fellow Nick Haber are investigating how infants learn about the world around them through play. Their infants – actually relatively simple computer simulations – are motivated only by curiosity. They explore their worlds by moving around and interacting with objects, learning as they go to predict what happens when they hit balls or simply turn their heads. At the same time, the model learns to identify which parts of the world it doesn’t understand, then tries to figure those out.
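The play-driven learning described here boils down to a simple loop: act where your predictions are worst, then update them. Below is a minimal, illustrative Python sketch of that idea; the agent class and the toy "world" are my own assumptions for exposition, not code from Yamins and Haber's actual simulations.

```python
class CuriousAgent:
    """Toy curiosity-driven learner: it prefers the action whose outcome
    it currently predicts worst, and refines its predictions as it acts."""

    def __init__(self, actions, lr=0.5):
        self.actions = actions
        self.lr = lr
        self.prediction = {a: 0.0 for a in actions}      # predicted outcome per action
        self.error = {a: float("inf") for a in actions}  # last prediction error

    def choose(self):
        # Curiosity: act where the model is most wrong (least understood).
        return max(self.actions, key=lambda a: self.error[a])

    def observe(self, action, outcome):
        err = abs(outcome - self.prediction[action])
        self.error[action] = err  # the intrinsic "reward" signal
        # Move the prediction toward the observed outcome.
        self.prediction[action] += self.lr * (outcome - self.prediction[action])
        return err

# A tiny world: each action has a fixed hidden outcome to be discovered.
world = {"hit_ball": 3.0, "turn_head": 1.0, "do_nothing": 0.0}
agent = CuriousAgent(list(world))
for _ in range(20):
    a = agent.choose()
    agent.observe(a, world[a])
```

After a few iterations the agent's predictions converge on the world's true outcomes, and its attention shifts away from actions it already understands, which is the essence of the curiosity signal described above.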


The Hub Problem with Distributed Backup

Organizations need an alternative to using the primary data center as the centralized hub. The cloud may be the ideal hub. In a cloud model, IT sends primary data center and remote office data to a public cloud provider, which acts as the centralized repository. Data is copied once and there is one primary store of all backup data. Some solutions will cache data at each remote office and the primary data center so that restores of recently protected data can be quickly serviced but the actual movement of data is just one step. A DR copy can automatically be created by replicating the cloud copy within the cloud infrastructure. The other advantage of using the cloud as the hub is that it almost guarantees remote management will be of high quality since all sites essentially require remote management. It also means that traveling or vacationing IT personnel can remotely manage the primary data center data protection process. Using the cloud also better positions the organization to adhere to various legal and regulatory standards that dictate where data can reside since the larger public cloud providers have multiple data centers in multiple regions.


Why Network and Security Operations Centers Should be Doing More

Even though a NOC or a SOC consolidates a variety of tools and measurements into a single management system, they are still too isolated. Rather than this siloed approach, what’s needed is a system that can bring security visibility and control into the NOC, and provide operational requirements and network and workflow visibility to the SOC. By combining these systems into a single, holistic solution, organizations can focus on the bigger picture of “secure throughput,” streamlining operations while managing and even anticipating critical security events. This new approach could also help overworked IT teams operate with the benefit of each other’s perspective, and enable organizations to realize a new level of protection and operational management that can simultaneously adapt to network changes. Not only will this added insight allow organizations to see events more clearly, but it also enables the development of effective automation that allows the network to respond to an event at digital speeds without impacting critical business processes.


Eight things to expect at Google I/O 2018


I/O will be more consumer and developer facing, so we should expect to hear more about products like Google Lens, as well as the company’s TensorFlow platform and its Tensor Processing Unit chips. Those chips are the core of the company’s specially designed AI training systems, and they help the company accelerate the learning process for its neural networks. Also expect to hear a lot of the same grandiose predictions about AI that we heard onstage at Facebook’s F8 developer conference last week, when executives also described AI as the future of Facebook’s business. Of course, it’s no surprise that Google and Facebook compete for top talent, as both companies have rival AI research divisions that command some of the highest salaries in the tech industry. ... Google Assistant and the Google Home hardware family it primarily lives on are slated to be big consumer-facing focuses for the company at this year’s I/O. Assistant remains Google’s largest competitive push against Amazon’s Alexa and, to a lesser extent, Apple’s Siri and Microsoft’s Cortana. And while Assistant does live on iOS and Android devices as an app and voice interface, it’s most readily useful as the OS layer for any number of smart home devices, starting with Google’s smart speaker family.


The 7 Fundamentals of IT Consultant Success

Don’t understate the value of the insights you gained working in their industry, Perkins says. “They are what will differentiate you in the early days of your consulting career,” he says. “Others will know the methods, tools and craft skills of consulting, but few will have the depth of industry-specific insight you bring to the table. Trade on this.” As you develop a sense of which industry sectors most interest you, seek out assignments that will extend your expertise, Perkins says. “Your value increases the deeper you go,” he says. “And conversely, actively manage yourself away from industry specializations that don’t interest you.” Early in his consulting career, Perkins was assigned to two large agricultural chemical clients in a row, and was beginning to be referred to as the “AgChem” subject-matter expert. “Nothing wrong with AgChem, but I fancied myself a financial services technology strategist and took steps to gain experience in other areas,” he says. “At the same time, though, don’t neglect the emerging technologies and methodologies that will keep you attractive to a broad range of client and assignment types.”



Quote for the day:


"When leaders are worthy of respect, the people are willing to work for them. When their virtue is worthy of admiration, their authority can be established." -- Huainanzi


Daily Tech Digest - May 05, 2018

Besieged Cambridge Analytica Shuts Down

Tuesday at Facebook's F8 developer event, the social media giant announced a number of measures to put control of data use back in the hands of the user, including the ability to scrub all data. "Cambridge Analytica should be viewed as a cautionary tale for any firm handling personal data," says Julie Conroy, director at Aite Group. "Just as the rash of breaches took cybersecurity to a C-suite and board-level issue over the past few years, the firestorm around Cambridge Analytica's various abuses illustrates why consumer data control and privacy also need to be top-of-mind issues for all company executives." When the news of the Facebook data leak scandal broke in March, the scale of the impact and aftershocks became quickly apparent. Facebook's CEO, Mark Zuckerberg, eventually testified before U.S. House and Senate committees about the firm's privacy practices. Because Zuckerberg has failed to appear before the U.K. parliamentary committee chaired by Damian Collins, despite repeated requests, Collins warned Facebook in a Tuesday letter that he's prepared to issue a summons for Zuckerberg's appearance.



What is IO Acceleration? – JetStream Software Briefing Note

Caches are small and volatile; IO Acceleration is large and durable. Caches were designed when memory-based storage was very expensive. If the organization could service 50% of its IO operations from cache, that was considered effective, but it still meant that the other 50% of the traffic had to cross the network and access data from hard disks. In-memory caches are not durable, meaning that power loss means data loss. The potential for data loss meant they were not safe for write caching, so all writes had to go to the hard disk tier. While writes make up less of the IO distribution of the typical environment, they are the slowest part of the IO chain. Flash writes data more slowly than it reads it, and each write often has an additional set of writes associated with flash management and data protection. The typical sizing of IO Acceleration, on the other hand, allows it to service 90% or more of all read requests, and its design lets it work with a variety of storage devices, including all-flash arrays and even the cloud. IO Acceleration also offers durability: it protects data outside of the system on which the acceleration software is installed, so that a power failure or even a server failure does not result in the loss of data.


Blockchain: Prep starts now; adoption comes later


To avoid investments in hardware, early blockchain experiments will likely take place on pay-per-use models such as public cloud. While this will allow projects to scale, companies will have to contend with issues related to costs, security, data privacy, compliance, and vendor lock-in. In the meantime, what should companies do to prepare for blockchain? Now is a good time to start evaluating use cases. Vendor or conference workshops can help educate IT and line-of-business executives about what blockchain can do today and get them started on documenting the processes for specific use cases. Workshops are an opportunity to get both internal and external parties involved. Generally speaking, companies wouldn't use a blockchain inside a single organization. A stronger value proposition comes from a consortium blockchain that crosses multiple organizations, establishing a trusted mechanism for recording transactions, implementing smart contracts, and building other blockchain applications.


Digitization makes the Supply Chain agile and customer-oriented

Industry 4.0 truly adds value to operations by providing the capability of analysing large amounts of data. Big Data analytics is one of the pillars of this new revolution, and supply chain personnel need to understand that there will simply be more information coming their way. Everything involved in a process, right from a warehouse rack, to a guillotine machine, to a supply container, will have the ability to communicate, which will then require analysis, and the CSCO needs to be ready for this. Highly automated process equipment and complex IT infrastructure do not eliminate the need for workers. On the contrary, they create the need for highly skilled workers who can effectively utilise the information at their disposal. The future workforce will need to be competent at problem solving and systems engineering. It is crucial for a leader to understand the current workforce and their capabilities, in order to help prepare the existing workforce for the challenges Industry 4.0 brings. Another key aspect for the CSCO to consider is end-to-end visibility across the supply chain.


Why Google Assistant could help make Android wearables more business-friendly

With the latest changes to Wear OS, Google has added two features to make using your watch as an organizational tool easier and more precise. Smart suggestions will generate options to narrow a query, with Google using the example of asking Wear OS about the weather. When a user asks about the weather, the current temperature and conditions appear on the screen—nothing is new there. What is new are suggestions available with a swipe up from the bottom of the screen: An evening forecast, weekend weather, and other recommendations appear as tappable buttons. Suggestions are available for various interactions and functions, similar to Google Assistant suggestions on Android smartphones. Google said it designed smart suggestions for quick interactions on the go, which can be great if you don't want to have an extended conversation with your wrist in public. The second productivity feature Google added to Wear OS is spoken responses, which can be a huge boon for busy people. Instead of displaying text for certain interactions on the screen, Assistant will now speak out loud via a watch's internal speaker or connected Bluetooth device.


9 machine learning myths


Machine learning is proving so useful that it's tempting to assume it can solve every problem and applies to every situation. Like any other tool, machine learning is useful in particular areas, especially for problems you’ve always had but knew you could never hire enough people to tackle, or for problems with a clear goal but no obvious method for achieving it. Still, every organization is likely to take advantage of machine learning in one way or another, as 42% of executives recently told Accenture they expect AI will be behind all their new innovations by 2021. But you’ll get better results if you look beyond the hype and avoid these common myths by understanding what machine learning can and can’t deliver. Machine learning and artificial intelligence are frequently used as synonyms, but while machine learning is the technique that’s most successfully made its way out of research labs into the real world, AI is a broad field covering areas such as computer vision, robotics and natural language processing, as well as approaches such as constraint satisfaction that don’t involve machine learning. Think of it as anything that makes machines seem smart.


NSA: The Silence of the Zero Days

Many organizations would do well to focus more on locking down their systems and worry less about whether they might get targeted by a zero-day attack. "At the end of the day, if you're bleeding from the eyeballs, just stop the bleeding," BluVector's Lovejoy told me. But as the Equifax breach dramatically demonstrated, it's tough to keep track of all patches. According to software vendor Flexera's Secunia research team, the number of documented, unique vulnerabilities in software increased from 17,147 in 2016 to 19,954 in 2017 - an increase of about 16 percent - across about 2,000 products from 200 vendors. The good news, Flexera's Alejandro Lavie told me at RSA, is that "86 percent of [newly announced] vulnerabilities have a patch available within 24 hours of their disclosure." But as the NSA's Hogue warned, patches can be quickly reverse-engineered by hackers - criminals, nation-states or otherwise. So organizations need to do a better job of hardening their hardware and software, including not only tracking but also applying patches everywhere they're required, as quickly as possible.


GDPR could be Facebook's toughest data management test yet


One of the most heated exchanges came between conservative minister Julian Knight and Schroepfer, the article said, with Knight saying Facebook was a "morality-free zone," destructive to privacy, and not an innocent party that was wronged by Cambridge Analytica. "Your company is the problem," he said. Facebook’s vice president and chief privacy officer Erin Egan and vice president and deputy general counsel Ashlie Beringer recently posted an update about its GDPR compliance plans and new privacy protections. They introduced new “privacy experiences for everyone on Facebook” as part of GDPR compliance, including updates to its terms and data policy. All users will be asked to review information about how Facebook uses data and make choices about their privacy on the social network. The company said it would begin by rolling these choices out in Europe. "As soon as GDPR was finalized, we realized it was an opportunity to invest even more heavily in privacy,” the posting said. “We not only want to comply with the law, but also go beyond our obligations to build new and improved privacy experiences for everyone on Facebook.”


A Multi-Gateway Payment Processing Library for Java

J2Pay is an open source multi-gateway payment processing library for Java that provides a simple, generic API for many gateways. It spares developers from writing individual code for each gateway: you write your code once and it works across all supported gateways, with no need to read each gateway's documentation. ... While working with J2Pay, you will always be passing and retrieving JSON. No matter which format is native to a gateway's API, you will always be using JSON, and for that, the library uses org.json. You do not have to worry about gateway-specific variables: where some gateways return transaction IDs as transId or transnum, J2Pay will always return transactionId, and it will give you the same formatted response no matter which gateway you are using. My first and favorite point is that you should not need to read the gateway's documentation, because a developer has already done that for you (maybe you are the developer who integrated the gateway).
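J2Pay itself is a Java library built on org.json, but the normalization idea it describes, mapping each gateway's native field names onto one canonical schema, can be sketched in a few lines. The Python sketch below is purely illustrative and does not reflect J2Pay's actual API; only transId and transnum come from the article, and the gateway and field names around them are made up.

```python
import json

# Per-gateway mapping from native field names to canonical ones.
# (transId/transnum are from the article; everything else is hypothetical.)
FIELD_MAPS = {
    "gateway_a": {"transId": "transactionId", "msg": "message"},
    "gateway_b": {"transnum": "transactionId", "statusText": "message"},
}

def normalize(gateway, raw_json):
    """Parse a gateway's raw JSON response and rename its fields
    to the canonical schema, leaving unmapped keys untouched."""
    raw = json.loads(raw_json)
    mapping = FIELD_MAPS[gateway]
    return {mapping.get(key, key): value for key, value in raw.items()}

# Two gateways, two native formats, one normalized result shape.
a = normalize("gateway_a", '{"transId": "123", "msg": "approved"}')
b = normalize("gateway_b", '{"transnum": "456", "statusText": "approved"}')
# Both responses now expose the same "transactionId" and "message" keys.
```

The payoff is exactly the one the article claims: calling code reads transactionId and never needs to know which gateway produced the response.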


How to master GDPR compliance with enterprise architecture

With the complexity of modern IT services and the increasing amount of data obtained by companies today, it’s not uncommon to lose visibility into everywhere information exists — and for data to float to unexpected areas — especially within large organizations. The first step towards achieving full compliance is establishing a clear view of your data — where it lives, how your company processes it and how to quickly access it to make key changes. While this is a daunting and time-consuming task, leveraging EA and application portfolio management (APM) tools can help you gain full visibility into your organization’s data landscape. Regardless of your existing EA sophistication, taking an application-centered approach will create a strong foundation for success. First, identify all existing applications inside the organization. Use surveys of application owners to uncover which applications involve personal data as defined by the GDPR, ensure that consent has been received from all data subjects, and identify all business capabilities that use the impacted applications.
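The survey step described here amounts to filtering an application inventory on a few attributes. A minimal sketch, assuming a simple inventory structure (all field and application names are illustrative, not taken from any particular EA or APM tool):

```python
# Illustrative application inventory, as might be assembled from
# application-owner surveys; field names are assumptions.
applications = [
    {"name": "CRM", "personal_data": True, "consent_recorded": True,
     "capabilities": ["sales", "marketing"]},
    {"name": "Payroll", "personal_data": True, "consent_recorded": False,
     "capabilities": ["hr"]},
    {"name": "Build Server", "personal_data": False, "consent_recorded": None,
     "capabilities": ["engineering"]},
]

# Step 1: applications in scope for the GDPR (they process personal data).
in_scope = [app for app in applications if app["personal_data"]]

# Step 2: flag in-scope apps where consent still needs to be obtained.
needs_consent = [app["name"] for app in in_scope if not app["consent_recorded"]]

# Step 3: business capabilities touched by the impacted applications.
impacted_capabilities = sorted({cap for app in in_scope
                                for cap in app["capabilities"]})
```

Even this toy pass surfaces the three outputs the article asks for: which applications are in scope, where consent gaps remain, and which business capabilities depend on the impacted systems.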



Quote for the day:


"A leader is always first in line during times of criticism and last in line during times of recognition." -- Orrin Woodward


Daily Tech Digest - May 04, 2018

7 Ways To Embrace Shadow IT & Win

Direction on how to deal with shadow IT tools is best obtained by asking users to discuss the value the technology is delivering to them and the specific problems it's helping to solve. "It's similar to what our IT teams do when evaluating new technologies, except that the new technology is already part of some business workflow," says Sean Cordero, head of cloud strategy at Netskope, a cloud security platform provider. "If it turns out your team can’t deliver the capabilities needed, then it’s likely a good time to dig further into the use cases and identify solutions that can meet the business' needs." A top shadow IT example is surreptitious use of public cloud services. Employees often share files, offer multiple users document access or simply back up important files to services such as Dropbox or Google Docs. "While these platforms are ubiquitous and easy to use, they can put sensitive data at risk," Green warns. He notes that enterprise-focused cloud platforms offer more robust security and utilization controls, including options to encrypt files so they can be accessed only by intended parties.



We're going to kill off passwords and here's how, says Microsoft

"Nobody likes passwords. They are inconvenient, insecure, and expensive. In fact, we dislike them so much that we've been busy at work trying to create a world without them -- a world without passwords," said Karanbir Singh, principal program manager for enterprise and security at Microsoft, in a blog post. Singh said the goal was to make it possible for end users to never deal with a password in their day-to-day lives, and to provide instead user credentials that cannot be cracked, breached, or phished. For Microsoft, multi-factor authentication and biometrics is seen as a good replacement for passwords -- using a physical key, and/or your face or fingerprint to log into your device instead of a string of letters and numbers. Singh said that Microsoft's Windows Hello biometric log-in is now being used by over 47 million users and that more than 5,000 businesses have deployed Windows Hello for Business, which is used on over one million commercial devices. Another technology in the mix is the Microsoft Authenticator app, which allows you to access your Microsoft account using your mobile phone.


How mobile money is spreading


Both the “Chinese” and the “Kenyan” models have crossed borders. Most developing countries have a mobile-payment service, but Sub-Saharan Africa is the only region where the share of adults with a mobile account exceeds 10%. Tencent has an e-payment licence in Malaysia where it plans to launch WeChat Pay—its first foray outside China and Hong Kong. Alipay has taken a higher-profile approach, enlisting merchants in Europe and America to accept it as a means of payment for the benefit of Chinese residents and tourists. And in Asia itself, Ant Financial has been investing in local mobile-payment services in India, Indonesia, Malaysia, the Philippines, Singapore, South Korea and, most recently, Pakistan. ... It is hardly surprising that many in this industry, rooted in charitable development work, feel ambivalent about vast commercial enterprises entering the payment business. The suspicions are not confined to Pakistan, and are likely to become more acute as American and Chinese tech giants slug it out for market share in poor countries. As a still largely nascent market of enormous potential, Pakistan also illustrates many of the other tensions affecting the payment business.


No Computing Device Too Small For Cryptojacking

It is unclear how many IoT devices an attacker would need to infect with mining software in order to profit from cryptomining, Merces says. A lot would depend on the type of device infected and the cryptocurrency being mined. "[But] a big botnet with a few thousand devices seems to be attractive to some criminals, even though some of them disagree." Not all of the cryptocurrency malware that Trend Micro observed is for mining. Several of the tools are also designed to steal cryptocurrency from bitcoin wallets and from wallets for other digital currencies like Monero. But a lot of the activity and discussion in underground forums appears centered on illegal digital currency mining. And it is not just computers that are under threat but just about any internet-connected device, Trend Micro says. "The underground is flooded with so many offerings of cryptocurrency malware that it must be hard for the criminals themselves to determine which is best," Merces says in a Trend Micro report on the topic this week.


Google releases open source framework for building “enclaved” apps for cloud


The SDK, available in version 0.2 for C++ developers, abstracts out multiple hardware and software back-ends for applications so they can be easily recompiled for any of them without a source code change. There's also a Docker image provided via Google Container Registry that includes all the dependencies needed to run the container in any environment that supports a TEE. "Asylo applications do not need to be aware of the intricacies of specific TEE implementations," wrote Google Cloud Senior Product Manager Nelly Porter and other members of the Google Cloud team in a blog post published today. "[Y]ou can port your apps across different enclave backends with no code changes. Your apps can run on your laptop, a workstation under your desk, a virtual machine in an on-premises server, or an instance in the cloud." The current Asylo implementation provides enclaves through the use of a software back-end. "We are exploring future backends based on AMD Secure Encryption Virtualization (SEV) technology, Intel® Software Guard Extensions (Intel® SGX), and other industry-leading hardware technologies."


New Research Finds C-Suite ‘Infosec Averse’


When asked which parts of their organizations’ demographics were most infosec-averse, 41 percent of respondents laid the blame on their fellow C-suite counterparts. In fact, management as a whole, from C-level executives down to junior department heads, was cited as the most likely to flout security practices and leave data vulnerable. Day-to-day knowledge workers, who are often charged with being the most likely to cause security problems, were cited by only 25 percent of respondents. Security C-suiters demonstrated a varied but sophisticated view of the risks posed by inefficient security. When asked what was their greatest concern regarding security, 26 percent cited the possibility of fines or other sanctions. In contrast, 42 percent of infosec executives cited a potential loss of stakeholder and customer trust as the most concerning potential repercussion. In third place was a loss of employee trust, noted by 16 percent of respondents. This number varied by age, with older infosec executives being more likely to cite stakeholder and customer trust as a greater concern, while younger executives were more concerned about fines.


Strategies to master continuous testing

If your enterprise delivers new software code several times a day, iteratively and agilely updating applications, you're not alone. A growing number of businesses focus on uninterrupted, continuous software delivery and deployment. This process sounds great, until you realize that continuous delivery (CD) can also mean constant bugs and hiccups. Continuous testing is the only way to avoid delivery failures. If you can test at the same speed that developers build code, your chances of catching bugs greatly increase. This Software Development Training Center entry covers strategies to implement, improve and assess continuous testing. Learn about continuous testing in DevOps, how to test with Jenkins and where continuous integration (CI) and continuous deployment fit in.


Crypto flaw in Oracle Access Manager can let attackers pass through

Oracle Access Manager CVE-2018-2879
“The Oracle Access Manager is the component of the Oracle Fusion Middleware that handles authentication for all sorts of web applications,” SEC Consult researcher Wolfgang Ettlinger explained. “In typical scenarios, the web server that provides access to the application is equipped with an authentication component (the Oracle WebGate). When a user requests a protected resource from the web server, it redirects her to an authentication endpoint of the OAM. The OAM then authenticates the user (e.g. with username and password) and redirects her back to the web application. Since all the authentication is handled by a central application, a user only has to authenticate once to access any application protected by the OAM (Single Sign-On).” But the vulnerability can be exploited to decrypt and encrypt messages used to communicate between the OAM and web servers. The researchers have managed to construct a valid session token and encrypt it, then pass it off as valid to the web server. This allowed them to access protected resources as a user already known to the OAM.
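The real OAM flaw involves a more intricate weakness in how messages are encrypted (the researchers withheld the details), but the general class of problem, an attacker manipulating or constructing encrypted tokens without knowing the key, can be sketched with a deliberately toy malleable cipher:

```python
# Toy illustration only (NOT the actual OAM cipher): if a session token is
# encrypted with a malleable cipher -- here, a simple XOR keystream -- an
# attacker who knows the plaintext layout can flip bits in the ciphertext
# to change the decrypted token without ever learning the key.

def xor_encrypt(data: bytes, keystream: bytes) -> bytes:
    return bytes(d ^ k for d, k in zip(data, keystream))

keystream = bytes(range(32))  # stand-in secret keystream
token = b"user=guest;admin=0".ljust(32, b";")
ct = xor_encrypt(token, keystream)

# Attacker flips the byte holding '0' to '1' directly in the ciphertext:
pos = token.index(b"admin=0") + len(b"admin=")
forged = bytearray(ct)
forged[pos] ^= ord("0") ^ ord("1")

assert xor_encrypt(bytes(forged), keystream) == b"user=guest;admin=1".ljust(32, b";")
```

This is why authenticated encryption (or at minimum a MAC over the token) matters: integrity protection, not just confidentiality, is what stops forged session tokens from being accepted.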


Rise of the decentralized and distributed mesh computer

Companies have been embracing cloud computing for nearly a decade, but it’s currently being disrupted by the IoT phenomenon. Analysts are predicting that there will be 75 billion internet-connected devices by 2025. The cloud was not designed for massive sensor data uploads, nor was it designed for low-latency, real-time communications. This is the catalyst for all IoT platform vendors racing to release edge computing gateways and appliances that bring more connectivity and computing capability to edge networks rather than routing everything through “the cloud.” ... With over 75 billion internet-connected devices expected by 2025, there is going to be a ton of idle, wasted CPU capacity and an insatiable demand for machine learning compute! We are moving into an era of decentralized and distributed computing in which every device computes, together with the others, as a peer-to-peer node on a global mesh computer. Decentralized web and decentralized apps will run on this new decentralized and distributed mesh computer.


Is Payments Industry Ready for New Encryption Protocols?

Dr. N. Rajendran, chief technology officer at National Payments Corp. of India, which is migrating to the new TLS protocol, notes: "The challenge for most organizations is to migrate their legacy systems to a new protocol; the entire process is ... investment intensive." But Tim Sloane, vice president of payments innovation at Mercator Advisory Group, points out: "It would be a sad commentary if acquirers are almost a year behind Salesforce.Com and others in upgrading to the more secure TLS 1.1 or higher. If acquirers or merchants haven't already deployed, or at minimum haven't got a plan to deploy, TLS1.1 or higher, then they have been asleep at the security switch and don't deserve to receive PCI compliance." Adds Julie Conroy, research director at Aite Group: "While we've known about this deadline since 2015, there are always laggards around various aspects of PCI compliance, and this is no exception. The problem of merchants running behind on security has been compounded as so many micro-merchants have come into existence over the past few years. Most of them believe they're too small to be on hackers' radar; ..."
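On the practical side, enforcing a TLS floor is often a small configuration change. A minimal client-side sketch using Python's standard library (the exact mechanism varies by platform and server software, and most guidance now recommends TLS 1.2 or higher rather than 1.1):

```python
# A minimal sketch of enforcing a minimum TLS version on the client side.
# PCI DSS required dropping "early TLS" (SSL/TLS 1.0) by June 2018.
import ssl

ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse TLS 1.0 and 1.1

assert ctx.minimum_version == ssl.TLSVersion.TLSv1_2
```

Servers and load balancers expose equivalent knobs (e.g., protocol lists in web server configuration); the migration pain described in the article mostly comes from legacy clients and embedded systems that cannot be reconfigured this easily.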


Microsoft Wants to Secure IoT and ICS Devices With New TCPS Project

Microsoft engineers have started working on a new project codenamed TCPS (short for Trusted Cyber Physical Systems) that is intended to provide a hardened system for securing Internet of Things (IoT) and Industrial Control Systems (ICS) devices. Microsoft formally announced TCPS at Hannover Messe 2018, a trade show for industrial technology that took place last week. ... Normally, good IoT and ICS systems utilize various security features to protect data in transit (data moving between devices, e.g., via HTTPS encryption) and data at rest (data stored on a device, e.g., via cryptographic file signatures). According to Microsoft, the purpose of its new TCPS project is to add support for the last missing piece in IoT and ICS systems design, protection for data in execution, by utilizing TEEs, similar to how they're used on desktops and laptops. Microsoft cited the recent attacks with the Trisis/Triton malware as the reason it started working on TCPS.



Quote for the day:


"An intelligent person is never afraid or ashamed to find errors in his understanding of things." -- @BryantMcGill


Daily Tech Digest - May 03, 2018

The Art and Science of Action-Driven Visual Analytics

In the world of big data, a visualization is merely a vehicle – a vehicle for us to create patterns, familiarity, and salience with data so that we can attract users’ attention and tap into their iconic memories. But to convince them to take action, we must think deeper and tap into their short-term and long-term memories: Who is my audience? Why should they care? Will I make their jobs easier and help them create more impact? With this framework in mind, let’s look at the two data visualization examples below and see which one is more effective. For illustration purposes, let’s assume that the user of this data visualization is a project manager at an IT consulting agency. Her performance is measured by the number of projects she completes and how quickly she delivers solutions to her customers. To achieve high impact, she constantly looks for areas that hinder her efforts or projects that drag down her performance. ... The magic of action-driven visual analysis is never about the beauty of the chart, but rather the thought process behind it: identify what is important for your audience, and then use visualization tools to surface what they care about.



A Simple DNS Configuration Change Can Reduce Your Risk

Quad9 blocks known malicious domains, preventing your computers and IoT devices from connecting to malware or phishing sites. Whenever a Quad9 user clicks on a website link or types an address into a web browser, Quad9 will check the site against IBM X-Force threat intelligence, which comprises more than 800 terabytes of data, including over 40 billion analyzed web pages and images and 17 million spam and phishing attacks monitored daily. Advanced analysis is performed on IP addresses to assign a risk score based on text, visual object recognition, optical character recognition (OCR), structure and linkages to other sites, and the presence of suspicious files to identify malicious IPs. This data feed, combined with multiple other threat intelligence providers, allows Quad9 to block a large portion of the threats that present risk to end users and businesses alike. It’s worth noting that Quad9 doesn’t just use IBM’s threat intelligence – 18 other feeds contribute to its threat blocking, which is unusual and gives a cross-section of blocking abilities from some of the world’s best threat management organizations.
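In practice, the "simple DNS configuration change" is just pointing your OS or router's resolver at 9.9.9.9. Purely for illustration, here is a hand-built DNS query in standard wire format aimed at Quad9; in real deployments you would never construct packets yourself, you would only change a resolver setting:

```python
# Build a standard DNS query (RFC 1035 wire format) for an A record.
import socket
import struct

def build_query(name: str, txid: int = 0x1234) -> bytes:
    # Header: id, flags (RD=1), 1 question, 0 answer/authority/additional
    header = struct.pack(">HHHHHH", txid, 0x0100, 1, 0, 0, 0)
    qname = b"".join(bytes([len(p)]) + p.encode() for p in name.split("."))
    question = qname + b"\x00" + struct.pack(">HH", 1, 1)  # type A, class IN
    return header + question

query = build_query("example.com")

# Uncomment to actually query Quad9 (requires network access):
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(query, ("9.9.9.9", 53))
# response = sock.recv(512)
```

If 9.9.9.9 considers the queried domain malicious, it responds as though the domain does not exist, which is what makes the protection transparent to applications.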


Do spreadsheets have a role in project management?

Whatever business you are in, it’s likely that somewhere in your organization there is a person or team responsible for project management. If they are handling multiple projects, juggling groups of projects, overseeing workflows and allocating tasks, their job becomes more about resource planning. That means they need a firm grip on who is doing what, where, and when – and must determine whether all resources are being used in the smartest way. They can use a variety of tools to help organize the resources at their disposal, which may be people, equipment, machinery, or office space. Some people schedule resources using Excel spreadsheets and an assortment of other unsophisticated tools, including calendars, whiteboards, and notepads. Whether these choices are made out of economy or out of ignorance of better alternatives, the failure to use specialist resource planning software is probably costing them time, money and the respect of team members and senior management.


Business Architecture At Raytheon: A Conversation With J. Bryan Lail


If someone truly means Enterprise Architecture as the roles, processes, value streams, business capabilities and ecosystems for your manufacturing, supply chain, finance, human resources, or everything your company does, then absolutely Business Architecture is part of that. It’s about translating from the vision and strategy level in Architecture through an understanding of the business needs and gaps before then architecting, or guiding, specific solutions, where solutions include process, roles, systems, information flow, and the technology. In that context, we’ve been building Business Architecture into the TOGAF ecosystem with a set of guides on how to apply Business Architecture as a very strategic tool and methodology. This enables the architect to flow from strategy around the ADM wheel to drive to the right solutions. The case study we presented uses those same Business Architecture methods, following step-by-step, including examples in Raytheon for Sales and Operations Planning. We walk through business modeling at the Vision Phase, then through value-stream analysis and business capability mapping in the Business Architecture phase.


Data Analytics and The Death of The Modern Banking Industry

It is a future where individual consumers sit at the center of their personal worlds and access the services that fit best into their lives thanks to the data about themselves that they choose to share with brands that they trust. Moreover, we are talking about trust on a personal, emotionally-engaged level. Not just the trust we have with a utility-style process that will work the exact same way the next time we need it. Although banks hold huge quantities of transactional data on millions of customers, they already face serious challenges to maintain the quality of that data and the way the data is used on behalf of the consumer. As customers turn to new payment methods, banks progressively lose the granular detail they used to have about their customers’ spending. Instead, they see a stream of transactions where anyone but the banks ‘owns’ the relationship: ApplePay, PayPal transfers, a direct debit to a Nutmeg or Betterment account, or storing value on a Starbucks mobile app … with rewards associated with many of these relationships. As such, a consumer can leave a bank in every way that matters without closing their account.


Optimism and Trust on the CEO’s Mind


The public mistrust of companies is also part of a longtime trend, one of declining respect for all institutions, not just corporations. According to Edelman, the mistrust of media is even greater than the mistrust of business. In the 28 geographies that Edelman surveyed, the overall trust for institutions accrued most to NGOs, then to business, then to government, and finally to media. In 21 of these geographies, business is more trusted than government. In that context, when it comes to dealing with social issues and fostering overall economic growth, people around the world increasingly expect business to step up to the responsibility. Other institutions have lost their license to lead; they aren’t seen as capable of making the right things happen. Two possible reasons for this shift in attitude come to mind. First, the private sector is now seen as an effective actor when business leaders choose to participate in solving the pernicious problems of our time: cybercrime, terrorism, the threat of nuclear war, income inequality and its political impact, and environmental damage.


Microsoft Releases .NET Framework 4.7.2


Microsoft's new .NET Core 2.0 and .NET Standard 2.0 offerings may be generating the most buzz among .NET developers these days, but for many use cases the traditional .NET Framework, just updated to version 4.7.2, is still the best choice. The new .NET Framework 4.7.2 is the next major update following the October 2017 release of v4.7.1, which added support for .NET Standard 2.0, the specification defining APIs that all conformant .NET implementations must provide to ensure consistent API usage across development projects, replacing the earlier Portable Class Libraries (PCL) as the means to create libraries for all scenarios. While .NET Core offers cross-platform functionality and more, the 16-year-old .NET Framework is still an optimal choice for targeting Windows desktop projects such as WinForms, WPF and ASP.NET Web Forms apps. Both .NET Core and .NET Framework are used for creating server-side apps ... Microsoft also provided guidance about when and when not to consider porting existing .NET Framework projects to .NET Core.


A proper DevOps feedback loop includes business leaders


It's a step up from waterfall, where processes add significant time to project completion in the name of stability. But DevOps itself is already wrong for the modern world: Streamlined processes between development and operations are only useful if the outcome supports the business. DevOps does not solve the problem of IT effectiveness, wherein IT must not just work quickly, but also must stay attuned to business requirements and project goals. DevOps is better christened BizDevOps, as everything that happens must be driven by the business. Development teams can act too selectively: Instead of focusing on an issue identified by operations as critical, they spend time on technically interesting and intellectually challenging tasks that are less pressing. The standard help desk feedback loop system, wherein operations and users raise issues in production, is wrong for DevOps. A DevOps feedback loop enforces priorities and project goals so that the freedom and fast pace in development doesn't lead it astray.


RPA is poised for a big business break-out

"It was scary for a lot of people," Thompson said. He ultimately reassigned those workers to engage with the company's clients and perform other higher value tasks. "Our business leaders are coming along for the journey," Thompson said. "They didn’t think these things were even possible and we’re now showing them the art of the possible." CIOs aren't looking to shed staff so much as free workers up for other work. To that end, bots are a big part of the plans for Walmart, which employs 2.3 million people. Walmart CIO Clay Johnson, who spoke on the panel along with Thompson, said the retail giant has deployed about 500 bots to automate anything from answering employee questions to retrieving useful information from audit documents. "A lot of those came from people who are tired of the work," Johnson said. Freeing up staff is part of Johnson's process automation plan to make Walmart's massive workforce more efficient. More broadly, Johnson's IT strategy entails delivering IT services as a series of products rather than traditional IT project management freighted with set deadlines and rigorous processes.


TigerGraph: The parallel graph database explained

We don’t describe TigerGraph as an in-memory database, because having data in memory is a preference but not a requirement. Users can set parameters that specify how much of the available memory may be used for holding the graph. If the full graph does not fit in memory, then the excess is stored on disk. Best performance is achieved when the full graph fits in memory, of course.  Data values are stored in encoded formats that effectively compress the data. The compression factor varies with the graph structure and data, but typical compression factors are between 2x and 10x. Compression has two advantages: First, a larger amount of graph data can fit in memory and in cache. Such compression reduces not only the memory footprint, but also CPU cache misses, speeding up overall query performance. Second, for users with very large graphs, hardware costs are reduced. For example, if the compression factor is 4x, then an organization may be able to fit all its data in one machine instead of four. Decompression/decoding is very fast and transparent to end users, so the benefits of compression outweigh the small time delay for compression/decompression. In general, decompression is needed only for displaying the data.
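The hardware-cost arithmetic in that last point can be made explicit. A small sketch with illustrative numbers (these are not TigerGraph benchmarks): with a compression factor of c, a graph needing raw_gb of RAM fits into raw_gb / c, so the number of machines required shrinks accordingly.

```python
# Illustrative memory arithmetic: how a compression factor reduces the
# number of machines needed to hold a graph entirely in RAM.
import math

def machines_needed(raw_gb: float, factor: float, ram_per_machine_gb: float) -> int:
    compressed_gb = raw_gb / factor
    return math.ceil(compressed_gb / ram_per_machine_gb)

# A hypothetical 1 TB raw graph on 256 GB machines:
assert machines_needed(1024, 1, 256) == 4  # uncompressed: four machines
assert machines_needed(1024, 4, 256) == 1  # 4x compression: one machine
```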



Quote for the day:


"Wherever there is authority, there is a natural inclination to disobedience." -- Thomas Haliburton


Daily Tech Digest - May 02, 2018

Next Port of Call — Digitization of Automotive Retail

As per Cox Automotive's survey, for each retail sale, customers visit the auto dealer at most two to three times, including to sign the contract and to take delivery of the vehicle. However, consumers are also taking a less conventional path, such as initiating the buying process online by “building a vehicle” to their specifications and then searching inventory in a specific geography. The buyer evaluates their current vehicle’s trade-in value based on its model, option content, age, and condition. The financial institution (either a traditional bank or a newer online lender) reviews, selects and approves financing and the consumer’s choice of purchase or lease in real time. Then the purchase process shifts from digital to more traditional retail, when the consumer arrives at the dealership to test drive the vehicle and sign the necessary paperwork to take ownership. Some dealers, taking advantage of their close proximity to the customer, further emulate the new online purchasing model by delivering the vehicle directly to the customer’s home at no charge.


Resolving who actually owns security in agile development

As we know, the developers’ main focus is getting a working product out the door as fast as possible, while the security folks want to reduce the chances that the product will contain vulnerabilities. Ideally, the developers would be able to code without any interruption or interference from the security folks. However, since developers are only human, there will always be flaws in the code they write themselves, as well as issues in the code they take from third parties, such as open source repositories on GitHub. We know that it is cheaper in terms of time and money to catch and fix vulnerabilities early in the process rather than later, especially once your developers have built more features on top of imperfect code. Moreover, we see a bottleneck occurring when security issues are left unaddressed until shortly before release (when stress levels are particularly high).


Shifting a Corporate Culture at Scale — and with Speed


Speed was very important in decision making. The culture of the prior organization was to extensively “discuss and deliberate.” As an example, the first meeting I was at had 25 people. My first call had 100 people. People were coming into meetings who were not necessarily contributing but who were transcribing and communicating to others; they weren’t the people who were supposed to take the action. One of the first meetings I had on July 14 was a review of the business. I had a stack of paper on one side, a stack of paper on the other side. I said, “I’m going to make a policy decision: no more paper.” And of course, I got a call that evening, saying, “Hey, I don’t know if you are aware of the fact you work for Xerox.” And I said, “Oops.” I said no more paper because the idea is to quickly convert people from the previous approach. People are showing up, and they’re basically reading off the presentation. So we changed that. But organizational structure is the clearest way to inform you as to how successful you will be.


State of Cybersecurity 2018: Enterprises Can Do Better

It seems that over the past 12 months, security has slipped down the boardroom agenda. According to the survey results, only 20% of organizations have their security function reporting to the CEO or main board. This represents an even lower figure than the 24% from last year (although the question in the previous year was phrased slightly differently). Also, 57% of the practitioners surveyed believed that their main board was adequately supporting security initiatives, a 10-percentage-point decrease from the 67% figure the previous year. On the bright side, 64% of enterprises were expecting to increase their cybersecurity budget this year, which also means that in 36% of enterprises the expectation is to make do with the same or less money for their security efforts. That is an improvement over last year (when only 50% of respondents expected a security budget increase) but still shows a degree of complacency or risk-optimism in a sizable number of organizations.


Rip and replace your RDBMS? No – build cloud apps instead.

“Customer success” is not just a nice way to give a new name to services. It is very much a mindset and a model that says you have to really understand what your customer is trying to achieve. That advice also means architecture planning, tying DataStax into an array of tools and vendors, from the storage layer to the security layer to the middleware layer: “How we interact and engage with our partners is all very important.” So is this a revenue play for DataStax, or is it about solidifying the customer relationship and making sure the projects deliver? Bosworth says it’s very much the latter. Without opening the entire financial kimono, he offered this: "We don’t share a lot of financial information. One thing I can tell you is our gross margins run north of 75 percent – that’s our blended gross margin as a company. That’s really how you can figure out if a company is a services company or a software company. Certainly anything upwards of 70 percent puts you in the software category. … kind of time-to-impact if you will."


University of San Francisco GE Digital Transformation Case Study

Improving the productivity of existing assets by even a single percentage point can generate significant benefits in the oil and gas sector (and in other sectors). “The average recovery rate of an oil well is 35%, meaning 65% of a well’s potential draw is left in the earth because available technology makes it too expensive,” explains Haynes-Gaspar. “If we can help raise that 35% to 36%, the world’s output will increase by 80 billion barrels — the equivalent of three years of global supply. The economic implications are huge.” GE bet big on the Industrial Internet. The company put sensors on all of its products, including gas turbines, jet engines, and other machines; connected them to the cloud; and analyzed the resulting flow of data. The goal: identify ways to improve machine productivity and reliability. And it didn’t take long for GE engineers to realize that they could find interesting and unique patterns in the data.


Car hackers find remotely exploitable vulnerabilities in Volkswagen and Audi vehicles

The researchers noted, “Based on our experience, it seems that cars which have been produced before are not automatically updated when being serviced at a dealer, thus are still vulnerable to the described attack.” I encourage you to read their research paper, which delves into their attack strategy and technical system details, but it does not fully disclose the details of the remotely exploitable vulnerability because that, they believe, would be “irresponsible.” The researchers said they want to protect future cars but ask, “What about the cars of today or cars that were shipped last week? They often don’t have the required capabilities (such as over-the-air updates) but will be on our roads for the next fifteen years. We believe they currently pose the real threat to their owners, having drive-by-wire technology in cars that are internet-connected without any way to reliably update the entire fleet at once.” The hacked car models were from 2015, so if you have an Audi or Volkswagen, contact your dealer and ask about a software update.


Collaboration with utilities seen as first step in growth of smart cities

Berst said cities can invest in becoming a smart city in small ways. From installing smart street lights to putting in solar rooftops and other distributed renewable energy sources, to providing residents with electric car charging stations, cities can not only provide a more environmentally friendly atmosphere, but also save money. Installing smart street lights, such as through the Urbanova initiative for example, can save a city millions in electricity costs. “Smart street lights have a pay-off of three years or less. It’s one of the lesser expensive on-ramps that can lead to a deeper collaboration,” Berst said. “While those trucks are there installing the LED street lights, why not have them snap in a communications network into that existing infrastructure while they are up there? Now, not only do you have smart street lights, but an entire communications network as well.”


A Quick Guide to Implementing ATDD


Collaboration is one of the core values of the Agile methodology. Once, as I was working on a large project, I noticed a lack of collaboration between developers, testers, and business-minded individuals; a lack of clarity in requirements; frequent requirements scope-creep; a lack of visibility in regards to the testing completed; and defects being identified late in the project lifecycle. Most importantly to me, though, was that no one had any idea about our automation framework, so all of the automation tests were written after the features were developed and ready for testing. ... As a result, I found Acceptance Test Driven Development (ATDD) as one of the approaches used to mitigate many of these issues. It is often used synonymously with Behavior Driven Development (BDD), Story Test Driven Development (SDD) and Specification By Example (SBE). The main distinction of ATDD, as opposed to other agile approaches, is its focus on making developers, testers, business people, product owners and other stakeholders collaborate as one unit and create a clear understanding of what needs to be implemented.
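A minimal sketch of the ATDD idea: the acceptance criterion is agreed with business stakeholders and written as an executable test before the feature is implemented. The Cart class and scenario below are hypothetical examples, not from the project described:

```python
# Hypothetical feature, built only after the acceptance test below was
# agreed with developers, testers and business people as one unit.
class Cart:
    def __init__(self):
        self.items = []

    def add_item(self, name: str, price: float):
        self.items.append((name, price))

    def total(self) -> float:
        return sum(price for _, price in self.items)

# Acceptance criterion, written first in Given/When/Then form:
# Given an empty cart, when two items are added, then the total is their sum.
def test_cart_total():
    cart = Cart()                 # Given
    cart.add_item("pen", 1.50)    # When
    cart.add_item("pad", 3.00)
    assert cart.total() == 4.50   # Then
```

In BDD-flavored tooling the Given/When/Then scenario would live in a feature file readable by non-programmers; the point of ATDD is that the shared, executable specification exists before development starts.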


At Interop: Everyone Into the AI Pool

"Now is the time to proactively look for problems where you can apply this. Yes, I think it's that important," he said, adding that you could toss a dart at a company org chart and find an area that could benefit from AI. Helping to identify the problems to be solved, and the type of improvement -- be it a new product or service, or a process improvement -- that should result is where business leaders need to work with technologists and data scientists to match the goals with technology capabilities. Putting AI and machine learning into action is where David Karandish, founder and CEO of Ai Software, took over. There's been plenty of discussion about how to use intelligent assistants or agents in the corporate world, taking a step beyond the bots that have popped up on websites in recent years. Karandish introduced the audience to his company's "Jane", a chat-based assistant that answers questions for employees and customers when integrated with a client company's internal systems. It's in use at several client companies besides his own.



Quote for the day:


"Knowledge is like underwear. It is useful to have it, but not necessary to show it off." -- Bill Murray


Daily Tech Digest - May 01, 2018

Over a million vulnerable fiber routers can be easily hacked

Over a million fiber routers can be remotely accessed, thanks to an authentication bypass bug that's easily exploited by modifying the URL in the browser's address bar. The bug lets anyone bypass the router's login page and access pages within -- simply by adding "?images/" to the end of the web address on any of the router's configuration pages, giving an attacker near complete access to the router. Because the ping and traceroute commands on the device's diagnostic page run at "root" level, other commands can be remotely run on the device, too. The findings, published Monday, say the bug is found in routers used for fiber connections. These routers are central in bringing high-speed fiber internet to people's homes. At the time of writing, about 1.06 million of these routers were listed on Shodan, the search engine for unprotected devices and databases. Half the vulnerable routers are located on the Telmex network in Mexico, and the rest are found in Kazakhstan and Vietnam.
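The router's actual firmware is not public, but the class of bug described, an auth filter that exempts static-asset paths yet matches against the full URL including the query string, can be sketched as follows (all names hypothetical):

```python
from urllib.parse import urlsplit

PUBLIC_MARKERS = ("images/", "login")  # paths the auth filter exempts

def naive_requires_login(url: str) -> bool:
    # BUG: checks the whole URL, so "?images/" in the query string makes
    # any protected page look like a public asset.
    return not any(marker in url for marker in PUBLIC_MARKERS)

assert naive_requires_login("/config.html") is True
assert naive_requires_login("/config.html?images/") is False  # bypassed

def fixed_requires_login(url: str) -> bool:
    # Fix: authorize on the path component only; ignore the query string.
    path = urlsplit(url).path
    return not any(marker in path for marker in PUBLIC_MARKERS)

assert fixed_requires_login("/config.html?images/") is True
```

Whether the real firmware fails in exactly this way is an assumption; the sketch only shows why substring-matching a full URL is a dangerous way to implement an authentication exemption.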


Native-Like Animations for Page Transitions on the Web

If you’re unfamiliar with Nuxt and how to work with it to create Vue.js applications, there’s another article I wrote on the subject here. If you’re familiar with React and Next.js, Nuxt.js is the Vue equivalent. It offers server-side rendering, code splitting, and most importantly, hooks for page transitions. Even though the page transition hooks it offers are excellent, that’s not how we’re going to accomplish the bulk of our animations in this tutorial. In order to understand how the transitions we’re working with today actually work, you’ll also need basic knowledge of the <transition /> component and the difference between CSS animations and transitions. I’ve covered both in more detail here. You’ll also need basic knowledge of the <transition-group /> component, and this Snipcart post is a great resource to learn more about it. Even though you’ll understand everything in more detail if you read these articles, I’ll give you the basic gist of what’s going on as we encounter things throughout the post.


GDPR: It’s A Marathon, Not A Sprint


The reality is that many companies will not be fully GDPR compliant by the required date. But it’s important to remember that GDPR is not an exhaustive list of what is and isn’t allowed; it’s a principle-based, legal framework to drive change, as opposed to a tick-box exercise. Those companies who purely view it as such will not be building the best platform to succeed in the future – and may even trip up along the way. With less than a month to go, we’ve pulled together some key learnings to help your business remain calm under pressure and show how keeping the right attitude and culture is crucial for true compliance. The main element to a positive GDPR journey is to remember that the regulation has been designed to better facilitate business across the digital market in Europe. Key to this is building trust with citizens and customers by clearly demonstrating that their rights are respected and their data is managed responsibly. It shouldn’t be looked at as another regulation as it essentially builds on data privacy and security principles which organisations should already be abiding by. 


Slack Releases Open Source SDL Tool

GoSDL is, he says, a fairly simple PHP application that allows any team member to begin the process of interacting with security. "The beginning of the process of a new feature is one where they can check whether they want direct security involvement," Feldman says. If so, the feature is flagged "high risk," not because of any actual risk but to make it high priority for security team action. If the security involvement box isn't checked, it doesn't mean that security steps aside, but their involvement begins with a series of questions about the impact on existing products and features. Once the security team is involved it begins to put together risk assessments (high, medium, or low) for each component of the feature. The product engineer or manager is responsible for a component survey with additional checklists of potential issues. All of the checklists and communications to this point are created in the PHP application running on the Slack platform. Once the lists reach the point of requiring action, the application generates a Jira ticket that creates the action item checklist.
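GoSDL itself is a PHP application and its code isn't quoted here; the following Python sketch only models the workflow described, intake form, risk flag, and ticket generation, with hypothetical names throughout:

```python
# Hypothetical model of the described workflow: checking the "security
# involvement" box flags the feature high-priority, and completed
# checklists produce a tracker ticket (a Jira ticket in the real tool).
from dataclasses import dataclass, field

@dataclass
class FeatureIntake:
    name: str
    wants_security_review: bool
    component_risks: dict = field(default_factory=dict)  # component -> "high"/"medium"/"low"

    @property
    def priority(self) -> str:
        if self.wants_security_review or "high" in self.component_risks.values():
            return "high"
        return "normal"

    def to_ticket(self) -> dict:
        # Stand-in for the generated tracker ticket with its action checklist.
        return {"summary": f"Security review: {self.name}",
                "priority": self.priority,
                "checklist": sorted(self.component_risks)}

intake = FeatureIntake("file sharing", wants_security_review=True,
                       component_risks={"upload handler": "medium"})
assert intake.to_ticket()["priority"] == "high"
```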


Tapping AI to Counter Rising Ransomware Threat in Big Data Era

cybersecurity with AI and big data
Most hacks carry a signature DNA. Beyond money, hackers are also driven by ego – the urge to beat the system, so to speak. Cyber forensics typically reveals this signature, and automating the process drastically reduces the time it takes to sniff out these threats. The DNA is hard-coded into their methods, which makes it almost impossible for hackers to change their signature mid-stream. This is, and has always been, their vulnerability. A good analogy is the police and criminals: unless investigators develop a predictive model to anticipate a crime before it happens, they will always be playing from behind. The FBI’s Behavioral Analysis Unit was established precisely to find patterns among serial offenders, in the hope of identifying them through their signatures and finally pinning them down. Going back to Zero Day Live, the platform can be fully integrated into the IT or business enterprise with hardly any disruption to operations. Combing through large volumes of data, the tool is able to assess vulnerabilities and craft an extensive threat analysis.
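As a toy illustration of the signature idea – and emphatically not Zero Day Live’s actual method – forensic indicators observed in an incident can be scored against known attacker profiles:

```typescript
// Toy illustration only: score observed forensic indicators against
// known attacker "signatures" and return the closest match.
interface Signature {
  actor: string;
  indicators: string[]; // e.g. tool names, file paths, mutex names
}

function bestMatch(
  observed: string[],
  known: Signature[]
): { actor: string; score: number } {
  const seen = new Set(observed);
  let best = { actor: "unknown", score: 0 };
  for (const sig of known) {
    const hits = sig.indicators.filter((i) => seen.has(i)).length;
    // Fraction of this actor's known signature present in the incident.
    const score = hits / sig.indicators.length;
    if (score > best.score) best = { actor: sig.actor, score };
  }
  return best;
}
```

Real attribution is far messier than overlap counting, but this is the shape of the automation the article gestures at: the pattern is fixed, so matching against it can be mechanized.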


10 Reasons Web Developers Should Learn Angular

There is no doubt that AngularJS – the self-proclaimed “superheroic JavaScript framework” – is gaining traction. I’ll refer to it frequently as just “Angular” in this post. I’ve had the privilege of working on an enterprise web application with a large team (almost 10 developers, soon growing to over 20) using Angular for over half a year now. What’s even more interesting is that we started with a more traditional MVC/SPA approach using pure JavaScript and KnockoutJS before we switched over to the power-packed combination of TypeScript and Angular. We also added comprehensive testing using Jasmine, and overall the team agrees the combination of technologies has increased our quality and efficiency: we are seeing far fewer bugs and delivering features far more quickly. If you are familiar with Angular, this post may give you some ideas you hadn’t considered before. If you know Angular and are trying to justify its adoption at your company or on your project, it can provide background information that may help.
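One reason the TypeScript-plus-Angular pairing tends to cut down on bugs: an AngularJS controller written as a TypeScript class gets compile-time type checking, and the class itself has no framework dependency, so it is trivial to unit test with Jasmine. This is a generic illustration, not code from the project described above:

```typescript
// Illustrative AngularJS-1.x-style controller written as a TypeScript
// class. Because it is plain TypeScript, it can be instantiated and
// unit-tested (e.g. with Jasmine) without bootstrapping the framework.
class TodoController {
  todos: { text: string; done: boolean }[] = [];

  add(text: string): void {
    this.todos.push({ text, done: false });
  }

  remaining(): number {
    return this.todos.filter((t) => !t.done).length;
  }
}

// In AngularJS this class would be registered along the lines of:
//   angular.module("app").controller("TodoController", TodoController);
```

Misspell a property or pass a number where a string is expected and the TypeScript compiler flags it immediately – the class of error that, in our pure-JavaScript days, only surfaced at runtime.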


The right way to manage devops configurations in the cloud

The right way to manage devops configurations in the cloud
An emerging best practice is to write your configurations with new code, change configurations with existing code, and couple those configurations directly to the code tree when sending it up the devops chain. That way, other tools and people can see the configuration bound to that particular code tree and database configuration without having to look for it in a configuration repository. This goes well beyond application configurations: security configurations, governance configurations, compliance configurations, database configurations, and testing scripts also need to be coupled to the application code tree. Do this as a best practice, so your workloads are logically and physically bounded and therefore easy to keep track of. Do this no matter how few workloads you need to track or how simple your devops tool chain is. Trust me: your workloads will grow, and your tool chain will get more complex quickly. If you don’t manage configurations the right way up front, you’ll pay a heavy price later, either in inefficiencies and errors or in retrofitting your applications’ configurations to what they should have been all along.
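Concretely, coupling configurations to the code tree just means versioning them beside the application source rather than in a separate configuration repository, so every build carries its own configuration with it. A hypothetical layout:

```
my-service/
├── src/                  # application code
├── config/
│   ├── app.yaml          # application configuration
│   ├── security.yaml     # security configuration
│   ├── compliance.yaml   # governance/compliance configuration
│   └── db.yaml           # database configuration
└── tests/                # testing scripts, versioned with the code
```

With this arrangement, a commit, a branch, or a tag pins the code and all of its configurations together as one unit as it moves up the devops chain.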


Digital is a long-term objective, CEOs warned

The survey highlighted the importance of culture change, but only 37% of those surveyed believed that deep cultural change was needed in their company by 2020. Raskino said: “Digital business is colossal, fundamentally changing certain kinds of products and services. This does not happen overnight. It is a long haul. “If you remember the shift from WAP banking to app banking – this took eight years, and it was a relatively superficial change. But a deeper change to the products and services of your business can take 10 or more years – some will even take 15. The risk for business leaders is that some people believe you can do it in three years.” The challenge for business leaders is that investment in new business models and digital products changes the investor proposition, said Raskino. “Investor confidence is expressed through board governance. Often no one on the board of directors will have a tech background, so the group behaviour is not to be risk aggressive.” This risk-averse governance can hamper a CEO’s ability to drive a long-term fundamental shift in the business towards digital products and services, he said.


For a more secure world, cities must share information on cyber-attacks: experts

One solution to improve cyber-security resilience is for city officials to talk more openly about attacks they have endured, said Paul Argyle, who advises the mayor of Greater Manchester in Britain. “We need to accept it doesn’t necessarily mean you’ve done anything wrong if you’ve been attacked. We need to start sharing all that information,” he said. Manchester is striving to be recognized as a global digital ‘smart city’, and recently hosted a series of digital summits to push its reputation as Britain’s leading interconnected region. Encouraging tech start-ups, investing in digital research and introducing smart ticketing on public transport so that passengers can use one ticket to ride a bus, tram or bike are some of the measures being taken, Argyle told the Thomson Reuters Foundation. Hospitals in the city were affected last year by the ‘WannaCry’ ransomware attack, which infected computers and crippled hospitals, banks and companies across the globe. Britain and the U.S. held North Korea responsible.


Security pros need to move beyond broken two-factor authentication

One of the simplest and most effective ways attackers can circumvent basic 2FA is real-time phishing. With a real-time phishing attack, it is relatively easy for an attacker to trick the user into giving up their username, password, and one-time passcode by asking them to log into a phishing website. The phishing website imitates the look, feel, and log-on experience of the “real” application – all with the intent of gaining unauthorized access to an organization’s systems and data. Recently, FireEye released a real-time phishing tool, ReelPhish, which it says it has used successfully during red team engagements. In fact, the FireEye article notes that IBM Security Intelligence first reported on the use of real-time phishing in 2010; that research found that 30% of attacks against websites using 2FA bypassed it.
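The reason real-time relaying works is simple timing arithmetic: a TOTP code is valid for an entire time step (30 seconds by default in RFC 6238), so a code phished from the victim and replayed within the same window is accepted. A rough sketch, assuming the default 30-second step:

```typescript
// Sketch of why relayed one-time codes work: TOTP validity is a time
// window, not an instant. RFC 6238 uses 30-second steps by default.
const STEP_SECONDS = 30;

// The time-step counter that a TOTP code is derived from.
function totpCounter(unixSeconds: number): number {
  return Math.floor(unixSeconds / STEP_SECONDS);
}

// A code captured at `phishedAt` still verifies at `replayedAt` if
// both timestamps fall in the same step (many servers additionally
// accept one step of clock drift, widening the window further).
function codeStillValid(phishedAt: number, replayedAt: number): boolean {
  return totpCounter(phishedAt) === totpCounter(replayedAt);
}
```

An attacker relaying the code within a few seconds almost always lands in the same window, which is why one-time codes alone do not stop real-time phishing; phishing-resistant methods instead bind the authentication to the site’s origin so a relayed credential is useless elsewhere.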



Quote for the day:


"Speak in such a way that others love to listen to u. Listen in such a way that others love to speak to u." -- Nicky Gumbel