Daily Tech Digest - August 17, 2018


Android 9 Pie is a massive, AI-infused software update, and it’s generally a pleasure to use. The handful of features made possible by machine learning are helpful additions, but there’s much more to Pie than that. Google has done a lot of nipping and tucking to make Android itself easier to use, and some system-level changes give the platform room to grow in some important ways. Not everyone will love the changes Google made here -- power users in particular -- but overall, it’s a thoughtful, worthy update that will only get better when even more features arrive later this year. That's all because Google built Android 9 Pie to pay more attention to what you do, and to help out accordingly. That suggestion wasn't random: It was based on behavior that Android picked up on. Beyond that, there are plenty of important system-level improvements and a handful of tweaks that make Android more pleasant to use. There's still plenty of work to be done, but after using the update (in its various forms) for a while, I can confidently say Google's efforts are paying off.



Improving testing architecture when moving to microservices


Before Gamesys rethought its testing architecture, its software development was split across client-side and core development teams. The client team focused on front-end application development, while the core team focused on back-end services. However, these separate teams often managed the same swaths of code, which caused a variety of communication and testing challenges. The back-end application was a monolith. If someone on the client team sought to add functionality, such updates would have to integrate with services on the monolith. Bugs sometimes required fixes on the back end. "As the situation got more complex, the synchronization between the client and core teams would get sloppy," Borys said. The overlapping responsibility and complexity of a monolithic architecture led to release delays. Gamesys shifted to a microservices approach, which made it easier to decouple application functionality across the entire stack.


Increasing Security with a Service Mesh


In some cases, security is an afterthought, and we try to shoehorn it into our apps at the last possible moment. Why? Because doing security right is hard. For example, something foundational like "encrypting application traffic" should be commonplace, and configuring TLS/HTTPS for our services should be straightforward, right? We may have even done it before on past projects. However, in my experience, getting it right is not as easy as it sounds. Do we have the right certificates? Are they signed by a CA that the clients will accept? Are we enabling the right cipher suites? Did I import that into my truststore/keystore properly? Wouldn't it just be easier to enable the "--insecure" flag on my TLS/HTTPS configuration? Misconfiguring this type of thing can be extremely dangerous. Istio provides some help here. Istio deploys sidecar proxies (based on Envoy proxy) alongside each application instance, and these proxies handle all of the network traffic for the application.
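The certificate and cipher-suite checklist above can be made concrete. A minimal sketch using Python's standard ssl module (not Istio itself, which handles this at the sidecar layer) showing what a correctly verified client context looks like, versus the tempting "--insecure" shortcut of disabling verification:

```python
import ssl

def make_client_context(ca_bundle_path=None):
    """Build a TLS client context with verification ON.

    The tempting shortcut -- check_hostname=False plus CERT_NONE, the
    moral equivalent of an "--insecure" flag -- silently disables the
    protections TLS exists to provide.
    """
    # create_default_context enables certificate verification, hostname
    # checking, and a sane set of cipher suites by default.
    ctx = ssl.create_default_context(purpose=ssl.Purpose.SERVER_AUTH)
    if ca_bundle_path:
        # Trust a private CA instead of the system store -- the Python
        # analogue of importing a cert into a truststore.
        ctx.load_verify_locations(cafile=ca_bundle_path)
    # Refuse protocol versions with known weaknesses.
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    return ctx

ctx = make_client_context()
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True: verification enforced
print(ctx.check_hostname)                    # True: hostname checking on
```

The point of the sketch is that the safe configuration is the short one; it is disabling verification that takes extra, deliberate code.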


Australia appoints information and privacy commissioner

"Falk has extensive experience delivering the functions of independent regulators and a track record of working across Commonwealth and state agencies, business, and the community in law, policy, and education," Porter said on Friday. "The commissioner role is critical to helping ensure the privacy of Australians, particularly in the online environment." Australia's Notifiable Data Breaches scheme requires organisations covered by the Privacy Act 1988 to notify individuals whose personal information is involved in a data breach that is likely to result in "serious harm" as soon as practicable after becoming aware of a breach. During Falk's tenure as acting privacy commissioner, the Office of the Australian Information Commissioner (OAIC) opened an investigation into the Facebook-Cambridge Analytica improper use of data after revealing that more than 311,127 Australians were caught up in the scandal. The investigation, kicked off in April, will consider whether Facebook breached the Privacy Act. "All organisations that are covered by the Privacy Act have obligations in relation to the personal information that they hold," Falk said at the time.


IoT on the edge: revolutionary use cases create new privacy concerns


Emerging use cases for IoT and video are even more concerning. Retailers have been piloting systems that identify shoppers when they enter a store by sensing their mobile devices. Advancements are allowing retailers to use security camera footage to capture shoppers’ faces and identify individuals in real time. From here, retailers can identify a shopper’s age, gender, ethnicity and other visual attributes; track their movements, actions and sentiments throughout the store; and create and compile shopping profiles based on this and other data. In another innovative use case, artificial intelligence-assisted video-monitoring software can simultaneously process footage from dozens of security cameras and identify abnormal and potentially criminal or dangerous situations, including theft or violence. Using the technology, one university in Australia has been able to thwart a variety of criminal events, from bike theft to assault, as the crimes occur, if not before.


As CTO you are responsible for both engineering and product

To succeed, CTOs need to have a deep understanding of the market they operate in and a broad knowledge across lots of different technologies. “The ability to go very deep very quickly on any single technology,” is essential, said Weinberg. What’s the best way to understand the market? According to Weinberg, the best way is to surround yourself with people who can feed you key information and insights. “That way you can distill information into patterns and opportunities, and act upon them. In my case, the person I rely on for this is my co-founder and CEO.” ... This is slightly paradoxical, as engineers and product people are typically very different types of people. “You can reason with engineers using cold logic, and most of the time there is a quantifiable right or wrong answer,” said Weinberg. “With product design, there is typically no strict right or wrong answer. Everything needs to be debated subjectively, alternate theories need to be compared and decisions often need to be made based on intuition.”


How to protect your privacy in Windows 10

You can turn that advertising ID off if you want. Launch the Windows 10 Settings app (by clicking on the Start button at the lower left corner of your screen and then clicking the Settings icon, which looks like a gear) and go to Privacy > General. There you'll see a list of choices under the title "Change privacy options"; the first controls the advertising ID. Move the slider from On to Off. You'll still get ads delivered to you, but they'll be generic ones rather than targeted ones, and your interests won't be tracked. ... Wherever you go, Windows 10 knows you're there. Some people don't mind this, because it helps the operating system give you relevant information, such as your local weather, what restaurants are nearby and so on. But if you don't want Windows 10 to track your location, you can tell it to stop. ... You can turn it off on a user-by-user basis as well — so if you have several people with different accounts using the same device, they can each turn location tracking on or off. To turn location tracking on or off for any single account, sign into the account, head back to this same screen and, instead of clicking Change, go to the slider beneath the word "Location" and move it to On or Off.


The road to effective bot development in DevOps environments


The important thing to understand is that repository webhooks provide the means by which bots can interact with a source code repository and perform instructive validation as part of their response behavior. There's already a good deal of work being done in repository bot development. Probot provides a framework that simplifies bot writing by adding a wrapper around the GitHub webhook architecture. The framework makes programming easier. Kore.ai is a bot development tool that allows developers to create fairly powerful bots using a low-code, graphical interface. A Kore.ai bot can be integrated with Bitbucket, GitHub and GitLab. Tools such as these will make bot development a lot easier. However, just because you can make a bot does not necessarily mean you have a useful bot. Believe me, the world does not need another confusing session with a chatbot. One bot that does one thing really well is better than an army of bots that cause confusion and frustration.
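Before a bot acts on a webhook, it has to confirm the delivery really came from the repository host. GitHub signs each delivery with an HMAC of the raw body using the hook's shared secret; frameworks like Probot do this check for you. A minimal Python sketch of that validation (the secret and payload here are hypothetical):

```python
import hashlib
import hmac

def verify_webhook(secret: bytes, payload: bytes, signature_header: str) -> bool:
    """Check a GitHub-style webhook signature.

    GitHub sends an HMAC of the raw request body as 'sha256=<hexdigest>'
    (older hooks use 'sha1=<hexdigest>'); a bot must recompute it with
    the shared secret before trusting the event.
    """
    algo, _, their_digest = signature_header.partition("=")
    if algo not in ("sha1", "sha256"):
        return False
    mac = hmac.new(secret, payload, getattr(hashlib, algo))
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(mac.hexdigest(), their_digest)

secret = b"my-webhook-secret"          # hypothetical shared secret
body = b'{"action": "opened"}'         # hypothetical delivery body
good = "sha256=" + hmac.new(secret, body, hashlib.sha256).hexdigest()
print(verify_webhook(secret, body, good))          # True
print(verify_webhook(secret, body, "sha256=bad"))  # False
```

Only after this check passes should the bot go on to run its validation logic against the repository.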


Avoiding the reference data quagmire

A company can put master data management (MDM) in place and make a tremendous effort to ensure its data quality. It can take pains to develop good data governance procedures and invest in leading-edge analytics technologies to make the most of its data. But any errors in its reference data can undermine all of these initiatives. Reference data is a non-volatile and slow-moving subset of enterprise data. It is often standardized by external bodies and businesses generally use the same reference data throughout their operations. Examples include country codes, SIC codes, currencies and measurement units. Typically, companies make use of several applications that feature drop-down lists of reference data, and may have various forms that are prefilled with certain reference data. To keep this data synchronized, some companies rely on reference data management or RDM.
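The core RDM idea, one authoritative table that every application validates against instead of keeping its own copy, can be sketched in a few lines. The country-code table below is an illustrative subset, not a real service:

```python
# A tiny reference-data "service": one authoritative, slow-moving table,
# standardized externally (ISO 3166-1 alpha-2), queried by every app.
COUNTRY_CODES = {
    "AU": "Australia",
    "DE": "Germany",
    "US": "United States",
}

def validate_reference(records, ref, field):
    """Return the records whose `field` value is absent from the
    reference table -- the errors that would otherwise quietly undermine
    MDM and analytics initiatives downstream."""
    return [r for r in records if r.get(field) not in ref]

orders = [
    {"id": 1, "country": "AU"},
    {"id": 2, "country": "UK"},   # common mistake: ISO code is "GB"
]
bad = validate_reference(orders, COUNTRY_CODES, "country")
print([r["id"] for r in bad])   # [2]
```

Centralizing the table is what keeps drop-down lists and prefilled forms across applications synchronized: they all read from the same source.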


New Trickbot Variant Touts Stealthy Code-Injection Trick


Upon execution, the malware sleeps for 30 seconds to evade sandboxes (by calling Sleep(30000)), and it then decrypts a dynamic link library file (named “shellcode_main,” which contains instructions that other programs can call upon to do certain things), which is then mapped to a buffer. Like older samples of Trickbot, researchers observed this newest variant make use of a technique called process-hollowing for unpacking. Instead of injecting code into the host program, Trickbot unmaps — or hollows out — legitimate code from target’s memory, and then overwrites the memory space with a malicious executable. First, a suspended process is created by the malware using CreateProcessW, which is then used to obtain a handle and copy the handle to a buffer, essentially re-reading and re-mapping that handle for various functions. These functions include unmapping the original malware module, creating a section to write the malicious code onto, mapping out the hollowed process, and resuming the suspended process and starting execution.



Quote for the day:


"If you find a path with no obstacles, it probably doesn't lead anywhere." -- Frank A Clark


Daily Tech Digest - August 16, 2018

U.S. Treasury: Regulators should back off FinTech, allow innovation

"Banks are very adept at innovating and experimenting with new products and services. The catch is the implementation of those products and services to ensure data privacy and security; it may take months or longer to prove data privacy and security efficacy," Steven D’Alfonso, a research director with IDC Financial Insight, said. The federal agency, however, specifically identified a need to remove legal and regulatory uncertainties that hold back financial services companies and data aggregators from establishing data-sharing agreements that would effectively move firms away from screen-scraping customer data to more secure and efficient methods of data access. Today, many third-party data aggregators unable to access consumer data via APIs resort to the more arduous method of asking consumers to provide account login credentials (usernames and passwords) in order to use fintech apps. "Consumers may or may not appreciate that they are providing their credentials to a third-party, and not logging in directly to their financial services company," the report noted.


Are microservices about to revolutionize the Internet of Things?

Individual edge IoT devices typically need to be extremely power efficient and resource efficient, with the smallest possible memory footprint and consuming minimal CPU cycles. Microservices promise to help make that possible. “Microservices in an edge IoT environment can also be reused by multiple applications that are running in a virtualized edge,” Ouissal explained. “Video surveillance systems and a facial recognition system running at the edge could both use the microservices on a video camera, for example.” Microservices also bring distinct security advantages to IoT and edge computing, Ouissal claimed. Microservices can be designed to minimize their attack surface by running only specific functions and running them only when needed, so fewer unused functions remain “live” and therefore attackable. Microservices can also provide a higher level of isolation for edge and IoT applications: In the camera function described above, hacking the video streaming microservice on one app would not affect other streaming services, the app, or any other system.


Companies may be fooling themselves that they are GDPR compliant

A closer look at the steps taken by many of these companies reveals a GDPR strategy that is only skin deep and fails to identify, monitor or delete all of the Personally Identifiable Information (PII) data they have stored. Such a shallow approach presents significant risks, as these businesses may be oblivious to much of the PII data that they hold and would have difficulty finding and deleting it, if requested to do so. They would also be unable to provide the regulatory authorities with the required information about data implicated in a breach within 72 hours of its discovery, another GDPR requirement. To address these risks, companies need a holistic strategy to manage their data, one that automates the process of profiling, indexing, discovering, monitoring, moving and deleting all of their data as necessary, even if it’s unstructured or perceived to be low-risk. This will significantly reduce their GDPR and other regulatory compliance risks, while simultaneously allowing them to make greater use of the data in ways that create business value.
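As an illustration of the discovery step, here is a deliberately naive Python sketch that scans free text for PII-like patterns. The two regexes are invented for illustration; a real PII-discovery tool uses far more robust detection (checksums, context, machine learning) across structured and unstructured stores alike:

```python
import re

# Hypothetical patterns -- illustrative only, not production-grade.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "phone": re.compile(r"\+?\d[\d ()-]{7,}\d"),
}

def scan_for_pii(text):
    """Return {pii_type: [matches]} for every pattern that fires.

    This is the 'discovering' stage of the profile/index/discover/
    monitor/move/delete pipeline described above, in miniature.
    """
    hits = {}
    for name, pattern in PII_PATTERNS.items():
        found = pattern.findall(text)
        if found:
            hits[name] = found
    return hits

doc = "Contact jane.doe@example.com or +61 2 9374 4000 for details."
print(sorted(scan_for_pii(doc)))   # ['email', 'phone']
```

The practical lesson is that discovery must run over all data, including the unstructured documents companies tend to overlook, or the 72-hour breach-notification clock becomes impossible to meet.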


Banks lead in digital era fraud detection


Banks have recognised the need to have an omni-channel view of the different interactions and do their fraud risk assessments across the various channels, he said, because not only are cyber fraudsters working across multiple channels, but so are ordinary consumers, starting something on a laptop, continuing it on the phone and perhaps completing it through a virtual assistant while travelling in a car. “This is true for all consumer-facing businesses that have different channels through which they interact with consumers, and they should follow the banks’ lead and adopt an omni-channel approach to doing their risk profiling and gain from the visibility you have in each of the channels,” said Cohen. “In enterprise security, we were talking about breaking down channels years ago, and now we are starting to talk about it in the context of fraud, so that fraud assessments are carried out in the light of what is going on across all the available channels of interaction, especially as interactions become increasingly through third parties.”


Over 9 out of 10 people are ready to take orders from robots

Perhaps organisations are not doing enough to prepare the workforce for AI. Almost all (90 percent) of HR leaders and over half of employees (51 percent) reported that they are concerned they will not be able to adjust to the rapid adoption of AI as part of their job, and are not empowered to address an emerging AI skill gap in their organization. Almost three quarters (72 percent) of HR leaders noted that their organization does not provide any form of AI training program. Other major barriers to AI adoption in the enterprise are: Cost (74 percent), failure of technology (69 percent), and security risks (56 percent). But a failure to adopt AI will have negative consequences too. Almost four out of five (79 percent) HR leaders and 60 percent of employees believe that it will impact their careers, colleagues, and overall organization. Emily He, SVP of Human Capital Management Cloud Business Group at Oracle, said: "To help employees embrace AI, organizations should partner with their HR leaders to address the skill gap and focus their IT strategy on embedding simple and powerful AI innovations into existing business processes."


Network capacity planning in the age of unpredictable workloads

When an application or a component of a distributed application moves or scales up, it needs a new IP address and capacity to route traffic to that new address. Every decision around workload portability and elasticity generates traffic on the data center network and the cloud gateway(s) involved. A workload's address determines how the workflows that pass through it connect, which defines the pathways on which to focus network capacity plans. To plan realistic capacity requirements, network engineers dive into the complex math of the Erlang B formula; if you are inclined to learn it, check out James Martin's older book Systems Analysis for Data Transmission. However, there are also easier rules of thumb. As a connection congests, the risk of delay and packet loss increases in a nonlinear fashion. This tenet contributes to network capacity planning fundamentals. Problems ramp up slowly until the network reaches about 50% utilization; issues rise rapidly after that threshold.
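The Erlang B formula mentioned above is compact enough to sketch directly. A Python version (my own illustration, not from the book) using the numerically stable recurrence B(0) = 1, B(m) = A·B(m−1) / (m + A·B(m−1)), rather than the factorial form, which overflows quickly:

```python
def erlang_b(traffic_erlangs: float, channels: int) -> float:
    """Probability that an offered call/connection is blocked, given
    `traffic_erlangs` of offered load and `channels` servers."""
    b = 1.0
    for m in range(1, channels + 1):
        b = (traffic_erlangs * b) / (m + traffic_erlangs * b)
    return b

# 2 Erlangs of offered load on 2 channels blocks 40% of attempts:
print(round(erlang_b(2.0, 2), 3))   # 0.4
# Adding capacity drives blocking down nonlinearly:
print(round(erlang_b(2.0, 5), 3))   # 0.037
```

The nonlinearity in the output is the same phenomenon behind the rule of thumb in the paragraph above: blocking (and hence delay and loss) falls off, or ramps up, much faster than linearly as capacity changes.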


Data recovery do's and don'ts for IT teams

Many, but not all, modern backup applications perform bit-by-bit checks to ensure the data being read from primary storage does in fact match the data being written to backup storage. Add that to your backup software shopping list, Verma advised. "The other thing I've noticed the industry's heading over to in the last couple of years is away from this notion of always having to do a full restore," Verma said. For example, you could restore a single virtual machine or an individual database table, rather than a whole server or the entire database itself. That's easy to forget when under pressure from angry users who want their data back immediately. "The days of doing a full recovery are gone, if you will. It's get me to what I need as quickly as possible," Verma said. "I would say that's the state of the industry." A virtual machine can actually be booted directly from its backup disk, which is useful for checking to see if the necessary data is there but not very realistic for a large-scale recovery, Verma added.
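The bit-by-bit check Verma describes can be sketched in miniature: copy the data, then re-read both sides and compare digests. A hedged Python illustration (real backup software does this at the block-device or API level, not on in-memory buffers):

```python
import hashlib

def digest(data: bytes) -> str:
    """Fingerprint a byte stream."""
    return hashlib.sha256(data).hexdigest()

def verified_copy(source: bytes, chunk_size: int = 4):
    """Copy `source` chunk by chunk to a backup buffer, then re-read
    both sides and compare digests -- the read-back verification step
    that should be on the backup-software shopping list."""
    backup = bytearray()
    for i in range(0, len(source), chunk_size):
        backup.extend(source[i:i + chunk_size])
    ok = digest(source) == digest(bytes(backup))
    return bytes(backup), ok

data = b"payroll database, week 33"
copy, ok = verified_copy(data)
print(ok)            # True: what was written matches what was read
print(copy == data)  # True
```

The same digest comparison is also what makes granular restores trustworthy: if a single table or virtual machine can be fingerprinted, it can be verified without restoring everything around it.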


Web Application Security Thoughts

Web application security is a branch of information security that deals specifically with the security of websites, web applications and web services. At a high level, it draws on the principles of application security but applies them specifically to internet and web systems. Every web application should follow a systematic process for security testing. ... The company or project owner has to decide which remediation will be the most effective for the application, because each application has a different purpose and different user groups. For a financial application, you have to be especially careful about transactions and the money stored in the database, so both application and database security are important. If you deal with card data, then you must meet the PCI guidelines to mitigate or prevent fraud, and the OWASP Top 10 vulnerabilities should be addressed properly. To protect an application nowadays, you cannot depend on the application alone; you also need a PCI-recommended next-generation firewall or WAF. A next-generation firewall will do the following things for a web application.
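To make the OWASP Top 10 point concrete, here is a small Python sketch of the single most common remediation, parameterized queries, which defuse SQL injection. The in-memory SQLite table and the payload are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (card_holder TEXT, balance REAL)")
conn.execute("INSERT INTO accounts VALUES ('alice', 120.0)")

user_input = "alice' OR '1'='1"   # a classic injection payload

# Vulnerable: string concatenation lets the payload rewrite the query.
rows_bad = conn.execute(
    "SELECT balance FROM accounts WHERE card_holder = '" + user_input + "'"
).fetchall()

# Safe: a bound parameter is treated as data, never as SQL.
rows_good = conn.execute(
    "SELECT balance FROM accounts WHERE card_holder = ?", (user_input,)
).fetchall()

print(len(rows_bad))   # 1 -- the injected OR clause matched every row
print(len(rows_good))  # 0 -- no card holder is literally named that
```

A WAF in front of the application can catch payloads like this one in transit, but as the paragraph argues, it is a complement to fixes in the application and database layers, not a substitute.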


DDoS attackers increasingly strike outside of normal business hours

While attack volumes increased, researchers recorded a 36% decrease in the overall number of attacks. There was a total of 9,325 attacks during the quarter: an average of 102 attacks per day. While the number of attacks decreased overall – possibly as a result of DDoS-as-a-service website Webstresser being closed down following an international police operation – both the scale and complexity of the attacks increased. The LSOC registered a 50% increase in hyper-scale attacks (80 Gbps+). The most complex attacks seen used 13 vectors in total. Link11’s Q2 DDoS Report revealed that threat actors targeted organisations most frequently between 4pm CET and midnight Saturday through to Monday, with businesses in the e-commerce, gaming, IT hosting, finance, and entertainment/media sectors being the most affected. The report reveals that high volume attacks were ramped up via Memcached reflection, SSDP reflection and CLDAP, with the peak attack bandwidth recorded at 156 Gbps.


AIOps platforms delve deeper into root cause analysis


The differences lie in the AIOps platforms' deployment architectures and infrastructure focus, said Nancy Gohring, an analyst with 451 Research who specializes in IT monitoring tools and wrote a white paper that analyzes FixStream's approach. "Dynatrace and AppDynamics use an agent on every host that collects app-level information, including code-level details," Gohring said. "FixStream uses data collectors that are deployed once per data center, which means they are more similar to network performance monitoring tools that offer insights into network, storage and compute instead of application performance." FixStream integrates with both Dynatrace and AppDynamics to join its infrastructure data to the APM data those vendors collect. Its strongest differentiation is in the way it digests all that data into easily readable reports for senior IT leaders, Gohring said. "It ties business processes and SLAs [service-level agreements] to the performance of both apps and infrastructure," she said.



Quote for the day:


"It is a terrible thing to look over your shoulder when you are trying to lead and find no one there." -- Franklin D. Roosevelt


Daily Tech Digest - August 15, 2018

Servitisation: What it is and how it can transform companies


It’s an idea that has certainly gained impetus, thanks largely to the growing capability of the internet of things (IoT). As we keep hearing, there will be approximately 20 billion IoT devices in place by 2020, and that level of connectivity is giving organisations an incredible amount of data on how products and services are being used, when they are performing well or when they are showing signs of malfunction, and so on. Such intelligence is worth something and, in field service circles, it’s already being used to provide predictive maintenance on products – a sort of weird science whereby the technicians can foresee problems and fix them before they happen. That’s the theory, anyway. ... For some, these figures may not be particularly enlightening – value-added services have always been good for business. But this is more than just bolting on a few services to products. This is also about using the product as a catalyst for offering customers something bigger and more ongoing. 



Because bitcoin's ledger is public, it's obvious when someone uses one of these complex transactions. Taproot puts an end to that by making these transactions look the same as every other "boring payment," as Maxwell put it in the technology's announcement post. Yet, it can't do this without Schnorr, an upgrade to bitcoin's signature scheme that has been on developers' coding agenda for years. Schnorr is supposed to be better than bitcoin's current signature scheme "in basically every way." And it enables Taproot because it allows signature data to be mashed together into one. "Schnorr is necessary for that because without it, we cannot encode multiple keys into a single key," Wuille continued in his presentation. ... But since developers have long been thinking about other enhancements, including those enabled by Schnorr, it's worth noting that Taproot isn't the only important change being considered. Towns thinks the privacy enhancement might be rolled in with a bunch of other upgrades going into bitcoin at the same time.
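The property Wuille describes, multiple keys and signatures collapsing into one, can be demonstrated with toy numbers. A deliberately insecure Python sketch of Schnorr signatures over a tiny group (real deployments use secp256k1, and the naive key addition shown here is vulnerable to rogue-key attacks, which protocols like MuSig were designed to address):

```python
# Toy parameters: G = 2 generates a subgroup of prime order Q = 11 mod P = 23.
P, Q, G = 23, 11, 2

def sign(x, k, e):
    """s = k + e*x (mod Q); the full signature is (R, s) with R = G^k."""
    return (k + e * x) % Q

def verify(y, R, s, e):
    """Schnorr check: G^s == R * y^e (mod P)."""
    return pow(G, s, P) == (R * pow(y, e, P)) % P

x1, x2 = 3, 5                          # two private keys
y1, y2 = pow(G, x1, P), pow(G, x2, P)  # their public keys
k1, k2 = 4, 7                          # per-signer nonces (fixed for demo)
R1, R2 = pow(G, k1, P), pow(G, k2, P)
e = 9                                  # shared challenge (hash of R and msg)

s1, s2 = sign(x1, k1, e), sign(x2, k2, e)
print(verify(y1, R1, s1, e), verify(y2, R2, s2, e))  # True True

# Aggregation: products of keys/nonces and the sum of signatures behave
# exactly like ONE key signing with ONE nonce -- Schnorr's linearity.
y_agg, R_agg, s_agg = (y1 * y2) % P, (R1 * R2) % P, (s1 + s2) % Q
print(verify(y_agg, R_agg, s_agg, e))                # True
```

That final verification succeeding is the whole trick: an observer of the ledger sees one key and one signature, indistinguishable from a "boring payment."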


IoT Disruption Has Begun. And Retail Is Just the Start. When Will Your Industry Be Affected?

Recent research from Aruba found that 79 percent of retail companies surveyed said they will have internet of things (IoT) technology in their businesses by next year. This is good news for consumers, considering that Aruba also found that IoT improves the customer experience in 81 percent of cases examined. The IoT’s impact is also more immediate than that of other consumer tech. While virtual reality (VR) and artificial intelligence (AI) may provide in-depth knowledge and perspective, their technologies are at least one step removed from what someone is actually experiencing in the real world. IoT connectivity, in contrast, allows customers to enhance their experiences in real time, making it the most direct way for businesses to improve their customer experience. ... If an innovation costs too much to implement, the increased cost will offset the improvements a company has made in the customer experience.


The Sensors That Power Smart Cities Are A Hacker's Dream


"There are people out there who aren’t white hat hackers who have had at least one of these exploits for ages and who knows what they’ve done with it," Savage says. "This is something that people have been looking into." Industrial control hacking has recently become a major focus of nation state attackers, for instance, with Russia taking the most prolific known interest. State-sponsored Russian hackers have probed US grid and election infrastructure and have wreaked havoc overseas, causing two blackouts in Ukraine and compromising payment systems in the country with malware campaigns. As the risk grows worldwide, US officials have increasingly acknowledged the vulnerability of US infrastructure, and agencies like the Department of Homeland Security are scrambling to implement systemic safeguards. Cities will continue to invest in smart technology. Hopefully as they do, they'll appreciate that more data will often mean more risks—and that those vulnerabilities aren't always easy to fix.


5 Trends For The Truly ‘Intelligent,’ Data-Driven Business


Opportunities to push intelligence into the physical world go way beyond smart toasters, fridges, or door locks. It’s not just the Internet of Things but the “Internet of Thinking.” The next generation of intelligent systems will transform industries, using techniques including collecting sensor data from distributed utility grids, scheduling predictive maintenance of industrial equipment, monitoring workforce safety, supporting healthcare patient engagement and research, and creating new business models for insurance. All these initiatives rely on the efficient storage, movement, and analysis of data. Global revenue for big data and analytics practices in 2017 was $151 billion, up nearly 12% from the year before. And companies around the world are betting big on advances in data-hungry technologies, with AI investments in 2017 at more than $12.5 billion and IoT at more than $800 billion, according to IDC. These investments will result in more than fast and efficient businesses, concludes Accenture.


How smart contact lenses will help keep an eye on your health

One of the main problems with such diabetic contact lenses is wearability. Creating a smart contact lens means putting rigid, non-see-through components onto a lens that will be in direct contact with the eye. The eye might be clever, but it's also sensitive: whatever touches it over a long period needs to be gentle enough not to damage the delicate cornea, while not obstructing the vision of the wearer. It's no mean feat, but the Purdue University researchers have come up with a new approach that they hope will open the doors to the long-delayed commercialisation of smart contact lenses. Using a new process called interfacial debonding, Lee's team were able to separate thin-film electronics from their own wafer substrate, and then print them onto another material -- in this case, contact lenses. The glucose sensors that are attached to the contact lens are covered by a "thin layer of transparent, biocompatible, and breathable polymer", according to Lee, meaning the lens won't irritate the eye or interfere with the normal vision of the wearer.


Why Even AI-Powered Factories Will Have Jobs for Humans

Just as the internet revolution ushered in completely novel jobs — for example, web designer and search-engine optimization engineer — so will the new era of AI. Tesla, for instance, is recruiting robot engineers, computer vision scientists, deep learning scientists, and machine learning systems engineers. And the company has also posted job listings for more-esoteric AI specialties such as a battery algorithms engineer and a sensor-fusion object tracking and prediction engineer. For the former position, the requirements go beyond knowledge of lithium-ion cells (cell capacity, impedance, energy, and so on) to include expertise to develop algorithms for state-of-the-art feedback control and estimation. Moreover, it’s not just technology-related jobs that are being reimagined with AI. In fact, as Tesla and other companies have discovered, AI technologies are having a profound impact throughout the enterprise, from sales and marketing, to R&D, to back-office functions like accounting and finance.


Cisco Patches Its Operating Systems Against New IKE Crypto Attack

According to Cisco, this flaw "could allow an unauthenticated, remote attacker to obtain the encrypted nonces of an Internet Key Exchange Version 1 (IKEv1) session." "The vulnerability exists because the affected software responds incorrectly to decryption failures. An attacker could exploit this vulnerability sending crafted ciphertexts to a device configured with IKEv1 that uses RSA-encrypted nonces," Cisco said in a security advisory. "With our attack you can do an active online attack. But it is impossible to recover data from an already established IPsec session with our approach," Martin Grothe, one of the researchers behind this new attack has told Bleeping Computer via email. "IKE runs before IPsec, thus you can only attack the first Phase of IKE and if you succeed you are able to impersonate another IPsec endpoint or be an active man-in-the middle and read/write data to that session," he added. With this in mind, applying the Cisco patches is highly recommended. Clavister, Huawei, and ZyXEL have also each released their own security advisories.


Mimecast extends core email security to enable cyber resilience


Not all organisations appreciate the importance of email security, according to Bauer. “The more advanced and mature enterprise security teams understand how wide open an attack vector email is because of all the opportunities it presents to attackers, including malicious attachments and links, email compromise attacks and a range of social engineering attacks. “The most sophisticated security teams are looking for the best technologies and they know how to evaluate those technologies. But at the opposite end of the spectrum, there are those businesses who think the email threat is limited to spam and is not that serious, perhaps because they have not yet had a serious incident or they have seen some attacks, but think it is just ‘bad luck’ and do not really address the problem. “But this is something that will not go away and can be extremely costly to targeted organisations, so organisations that have not done so already should pay attention to shoring up their defences against email threats.”


What’s the role of the CTO in digital transformation?

A CTO needs to take on the role of the ‘bridge builder’ between the strictly technical components of a transformation strategy and how they can apply to people and process in the specific context of an organisation. ... The CTO has specific technological insight and therefore needs to be directly involved in helping the entire organisation identify where technical systems are simply obsolete and not fit for purpose. So, as well as being a bridge builder, CTOs naturally lead the charge when dealing with a technology-led approach. They must be able to explain where the value is in the application of technological change in context – too often we see visions that are de-contextualised from the reality on the ground. De-contextualised technological planning does not allow for realistic strategic planning. With visions of the ambitious but feasible in sight, it is then the whole leadership team’s task to decide what course they are going to map out and to work together on the digital transformation journey.



Quote for the day:


"Entrepreneurship is neither a science nor an art. It is a practice." -- Peter Drucker


Daily Tech Digest - August 14, 2018

Once your DevOps team understands the metrics that connect your work to customer satisfaction, look at all the tools your DevOps team uses to deliver for your success metrics, says Rosalind Radcliffe, a distinguished engineer at IBM. "Many of the tools that people adopt as part of their DevOps transformation help generate data that can help drive numbers into these metrics," says Radcliffe.  Knowing what to measure and how to measure is half the battle, but what's possibly more important is knowing who needs to raise their hand to keep the DevOps team on track.  “The glib answer,” says Mann, “is that everyone should care [about DevOps success metrics].” However, the more specific answer Mann gives is leaders of application development or a project management office (PMO) or project manager (PM) for application development at larger companies. If you’re at a smaller company, Mann says a scrum master could be in charge of evaluating a DevOps team’s value stream and success metrics.
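Lead time from commit to deploy is one of the metrics DevOps tooling can generate from data it already collects. As a minimal sketch (the function name and sample timestamps below are invented for illustration), computing it from paired commit/deploy timestamps takes only a few lines of Python:

```python
from datetime import datetime

def mean_lead_time_hours(pairs):
    """Mean commit-to-deploy lead time in hours.

    pairs: list of (commit_time, deploy_time) ISO-8601 strings.
    """
    total_seconds = sum(
        (datetime.fromisoformat(deploy) - datetime.fromisoformat(commit)).total_seconds()
        for commit, deploy in pairs
    )
    return total_seconds / len(pairs) / 3600.0

# Two hypothetical releases: 6 hours and 10 hours from commit to deploy.
releases = [
    ("2018-08-01T09:00:00", "2018-08-01T15:00:00"),
    ("2018-08-02T10:00:00", "2018-08-02T20:00:00"),
]
print(mean_lead_time_hours(releases))  # → 8.0
```

Tracking this number per sprint is one concrete way to see whether process changes are actually shortening the path from work done to value delivered.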


Does Facebook even need a CSO?

As sister publication CIO has reported, not every company needs a CSO. “It's not a person who needs a seat at the table,” Simple Tire CIO CJ Das said at a CSO event. “The topic needs a seat at the table.” Indeed, Flick says, “We expect to be judged on what we do to protect people's security, not whether we have someone with a certain title.” Pinterest and Tumblr don’t have CSOs. ... “The creation -- and I guess I would say appointment of any leadership role,” Coates notes, “also tells a story to the public, intentional or otherwise. In some regards, a chief security officer is also a central point to inspire confidence that this is something where they're putting a senior role at the table to tackle this issue.” That’s why, he says, New York State mandates that all financial institutions have a CSO. “Because there is a challenge in security when the work is distributed amongst teams without a central owner, you can lose some of that experience and central visibility that a security organization brings to the table,” Coates explains. “A commitment to a high-placed individual shows the company has kind of matured to that level and is thinking of it that way.”


Why you should shift from systems thinking to capability thinking

At a basic level, a capability is nothing more than an ability to perform a function. This capability could be unique in the industry, or relatively rudimentary, and in some cases a seemingly simple capability requires all manner of supporting capabilities to successfully realize it. For example, Amazon has a capability for same-day package delivery in most areas, a feature that's easy to articulate and understand, but it provides a significant advantage over competitors that don't have the logistical prowess to match Amazon. Generally speaking, most IT projects and policies start with a capability in mind. That capability might range from an ability to close financial reporting within a certain time frame to providing low-cost voice communications across geographies. However, most IT shops abandon this capability mindset once they select a suite of tools designed to deliver that capability. Suddenly, IT shifts from "providing a supply chain capability" to "we're an Oracle shop." Think of this like a carpenter going from saying "I'm a cabinet maker" to "I'm a Stanley 15-106A coping saw shop."


Why Choosing Python For Data Science Is An Important Move

Python has also become ubiquitous on the web, powering numerous high-profile websites with web development frameworks like Django, Tornado, and TurboGears. More recently, there are signs that Python is also making its way into the field of cloud services, with all major cloud providers including it in some capacity in their offerings. Python obviously has a bright future in the field of data science, especially when used in conjunction with powerful tools such as Jupyter Notebooks, which have become very popular in the data scientist community. The value proposition of Notebooks is that they are very easy to create and perfect for quickly running experiments. In addition, Notebooks support multiple high-fidelity serialization formats that can capture instructions, code, and results, which can then very easily be shared with other data scientists on the team or as open source for everyone to use. For example, we’re seeing an explosion of Jupyter Notebooks being shared on GitHub, with more than 2.5 million and counting.
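The kind of quick experiment a notebook cell typically holds can be sketched in plain Python. This toy cell (the data is simulated, not drawn from any real study) takes a reproducible sample and summarizes it, exactly the sort of instructions-plus-results payload a shared notebook captures:

```python
import random
import statistics

# A notebook-cell-style quick experiment: simulate a measurement
# and summarize it. The fixed seed keeps the "experiment"
# reproducible for anyone the notebook is shared with.
random.seed(0)
sample = [random.gauss(100, 15) for _ in range(1000)]

summary = {
    "n": len(sample),
    "mean": statistics.mean(sample),
    "stdev": statistics.stdev(sample),
}
print(summary)
```

In a notebook, the printed summary is serialized right next to the code that produced it, which is what makes sharing results on GitHub so frictionless.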


Malicious Cryptocurrency Mining: The Road Ahead

Scammers are even resorting to API abuse to freeze their victims’ browsers, with the primary target being Chrome, followed by Brave and Firefox. Fraudsters that operate by providing fake tech support services mainly depend on gaining control of easily exploitable business functions rather than on specific tools. Apart from exploiting security flaws in Bitcoin transactions, scammers are capitalizing on the long patch lag time for tech support cases to make large profits. In January of this year, processors were greatly impacted by two vulnerabilities, namely Meltdown and Spectre. While Meltdown was used exclusively against Intel processors, Spectre could attack almost all processors. These vulnerabilities can be used to access people’s login credentials, banking information, and personally identifiable information. Notably, Microsoft, Intel, and other vendors have implemented patches, but there are issues that may need to be addressed in the long run.


Smart cities face challenges and opportunities


One constant across all projects is data traffic. Although replicating projects is a challenge, data collection and traffic vary greatly between city pilot projects and full-scale deployments. In a recent RootMetrics by IHS Markit test of internet of things (IoT) technologies in Las Vegas, even at a full-scale phase, the network exhibited significant problems. In fact, certain IoT networks were unable to provide enough coverage to support even the simplest smart city applications. Due to the ever-increasing volume of sensors and their data, robust connectivity technology is a requirement for success. It is also often limited by a city’s budget. According to RootMetrics, coverage and reliability across the entire city are the key to launching any successful smart city programme. Digital security is another threat cities face when they try to implement smart city projects. As personal data gets uploaded into the cloud, it is often shared with digital devices, which, in turn, share the information among multiple users.


The Impact Of Artificial Intelligence On The (Re)Insurance Sector

AI and data will take us into a world of ex-ante predictability and ex-post monitoring, which will change the way risks are observed, carried, realized and settled. This fundamental change will lead to profound changes to market balances and equilibriums in the (re)insurance sector, and will completely redefine the dynamics of the insurance market, on both the supply and the demand side. Although (re)insurers might have an early advantage given that they have greater means, more tools and, for the time being, more data, AI will transform both sides of the insurance transaction in the long run. All parties to the insurance ecosystem, be they risk carriers, brokers, or even customers, will use AI tools. It is even likely that negotiations and discussions could take place between the AI systems of the different parties to the insurance contract! This might appear futuristic, but the idea of two AI systems dueling with each other and trying to “fool” each other already underpins generative adversarial networks (GANs) and the concept of adversarial learning, which are used in research for video recognition and analysis.


Web security gets a boost as TLS gets major overhaul

CloudFlare implemented one of those early versions of TLS 1.3 on its servers in 2016, but by the end of 2017 found most traffic still relying on TLS 1.2. ... Since then, however, web giants like Facebook have enabled TLS 1.3. Google enabled it in Chrome 65 -- the latest version of Chrome is version 68 -- but at the time had only rolled out support on Gmail. Firefox-maker Mozilla also announced that TLS 1.3 draft 28 is already shipping in Firefox 61, noting that one of its biggest improvements on the security front is that it cuts out outdated cryptography in TLS 1.2 that made attacks like FREAK, POODLE, Logjam and others possible. It will ship the final version of TLS 1.3 in Firefox 63, due out in October. "Although the previous version, TLS 1.2, can be deployed securely, several high profile vulnerabilities have exploited optional parts of the protocol and outdated algorithms," IETF engineers said in a statement. "TLS 1.3 removes many of these problematic options and only includes support for algorithms with no known vulnerabilities."
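On the client side, Python's standard `ssl` module can already pin a connection to TLS 1.3 (this assumes Python 3.7+ linked against OpenSSL 1.1.1 or later, which is where TLS 1.3 support landed); a minimal sketch:

```python
import ssl

# Build a client context that refuses anything older than TLS 1.3,
# so a handshake can never fall back to the legacy algorithms that
# enabled attacks like FREAK, POODLE and Logjam.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_3
```

A handshake made with this context against a server that only speaks TLS 1.2 will fail outright rather than silently downgrade to the older protocol.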


Samsung Announces New SmartThings Mesh Wi-Fi System

SmartThings Wifi works as a SmartThings Hub to serve as the “brain” of the smart home. Samsung’s open SmartThings ecosystem makes it easy to automate and manage the smart home with one hub and the SmartThings app. Compatible with hundreds of third-party devices and services, SmartThings enables users to expand their smart home with lights, door locks, cameras, voice assistants, thermostats and more. SmartThings Wifi delivers a simple, convenient 2-in-1 solution, offering whole-home automation out of the box. For corner-to-corner coverage, users can choose the right Wi-Fi configuration to fit their needs, from a home with multiple levels to a small apartment. Each SmartThings Wifi router has a range of 1,500 square feet, with the 3-pack covering 4,500 square feet, and users can expand coverage by adding additional mesh routers to the set-up. It is easily set up and managed by the Android and iOS compatible SmartThings app.
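Using the quoted 1,500-square-foot per-router figure, the pack size needed for a given home reduces to a ceiling division. Real coverage will vary with walls and interference, so treat this as a back-of-the-envelope sketch:

```python
import math

def routers_needed(area_sqft, per_router_sqft=1500):
    """Minimum number of mesh routers to cover a floor area,
    using the quoted 1,500 sq ft per-router range."""
    return math.ceil(area_sqft / per_router_sqft)

print(routers_needed(4500))  # → 3, matching the 3-pack's quoted coverage
```

Note that even a small overshoot (say 1,501 sq ft) rounds up to an extra router, which is exactly why mesh systems are sold as expandable sets.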


Five key security tips to avoid an IoT hack

While layered security must remain the key priority, it is essential to understand that generic networking equipment and IoT devices are the weak link. They often have no continuous update program for firmware and software, low lifetime support, and insufficient computational power to host an antivirus or any other security agent. As practice shows, they are almost always left alone without proper supervision in consumer homes, on the network perimeter of small and medium business offices, or in branches of huge corporations. It is crucial to keep up with the evolving threat landscape. To do that, companies need to move away from traditional security approaches to next-generation solutions, especially security controls that are driven by artificial intelligence. The latter are capable of precisely mapping a network and identifying all devices (even those that might be left alone somewhere on the edge of the network).



Quote for the day:


"When you have to make a decision, you can't wait for tomorrow to make it. You've got to make it now, and then if it's wrong, maybe tomorrow you can make a better one." -- Harry S. Truman


Daily Tech Digest - August 13, 2018

Google DeepMind's AI can now detect over 50 sight-threatening eye conditions


In a project that began two years ago, DeepMind trained its machine learning algorithms using thousands of historic and fully anonymized eye scans to identify diseases that could lead to sight loss. According to the study, the system can now do so with 94 percent accuracy, and the hope is that it could eventually be used to transform how eye exams are conducted around the world. AI is taking on a number of roles within health care more widely. ... AI is also being used to help emergency call dispatchers in Europe detect heart attack situations. Diagnosing eye diseases from ocular scans is a complex and time-consuming task for doctors. Also, an aging global population means eye disease is becoming more prevalent, increasing the burden on healthcare systems. That's providing the opportunity for AI to pitch in. "The number of eye scans we're performing is growing at a pace much faster than human experts are able to interpret them," said Pearse Keane, consultant ophthalmologist at Moorfields, in a statement. "There is a risk that this may cause delays in the diagnosis and treatment of sight-threatening diseases, which can be devastating for patients."



How Fintech Is Transforming Access to Finance

A percentage of the digital transactions that merchants receive are set aside to repay their advances. This arrangement keeps repayments fluid, bite-sized, and in line with cash flow. In India, Capital Float, a nonbank finance company, provides instant decisions on collateral-free loans for small entrepreneurs. A risk profile assessment is carried out in real time by analyzing MSMEs’ cash flows using data from Paytm, an e-commerce payment system and digital wallet company, mobile financial services firm Payworld, and smartphones. Capital Float customers carry out electronic know-your-customer authentication, receive the loan offer, confirm acceptance, and sign the loan agreement on a mobile app. The loan amount is credited to their account on the same day, with nil paperwork. Cash flow loans help MSMEs seize opportunities when they arise and are an excellent example of the targeted, niche innovation that enables fintech to compete with more prominent—but slower—traditional banks. They are well-suited to businesses that maintain very high margins, but lack enough hard assets to offer as collateral.
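The merchant-advance model described above, where a fixed share of each day's digital sales is set aside as repayment, can be sketched as a simple simulation. The 10% holdback rate and the sales figures below are invented for illustration; real products negotiate the rate per merchant:

```python
def apply_repayments(advance, daily_sales, holdback=0.10):
    """Remaining advance balance after setting aside a fixed share
    (the holdback) of each day's digital sales as repayment."""
    balance = advance
    for sales in daily_sales:
        balance = max(0.0, balance - sales * holdback)
    return balance

# A 1,000-unit advance repaid from a week of variable sales.
# Slow days repay little, busy days repay more: the schedule
# stays "fluid, bite-sized, and in line with cash flow".
remaining = apply_repayments(1000.0, [800, 1200, 500, 0, 900, 1500, 1100])
print(remaining)
```

The zero-sales day repays nothing, which is precisely the property that makes this structure gentler on merchants than a fixed installment loan.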


The Commercial HPC Storage Checklist – Item 3 – Protection at Scale


Many HPC storage solutions provide only replication for data protection. Replication protects against media failure within a node by creating two or three additional copies of data on other nodes in the storage cluster. The problem is that a replication-only model forces the organization to store two or three full additional copies of data. While replication does maintain performance during a failure, the level of exposure to an additional failure is enormous. Most enterprise storage systems support a single or dual parity protection scheme. While parity does not have the capacity waste of a replicated system, it can hurt storage performance if the design of the storage system cannot maintain performance during a failure/rebuild process. A Commercial HPC storage system needs to provide a parity-based protection scheme so that it does not waste capacity or data center floor space. Because restarting workloads is so time-consuming, it also needs multiple layers of redundancy so that one or two drive failures don’t stop an HPC process from executing.
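The capacity trade-off is easy to quantify. Below is a rough sketch comparing raw-to-usable capacity ratios for replication versus a parity stripe; the 8+2 stripe width is an illustrative choice, not something the article specifies:

```python
def raw_per_usable(copies=1, data_drives=None, parity_drives=0):
    """Raw bytes stored per usable byte.

    Replication: pass copies=N (N full copies of the data).
    Parity: pass data_drives=k and parity_drives=m for a k+m stripe.
    """
    if data_drives is None:
        return float(copies)
    return (data_drives + parity_drives) / data_drives

three_way = raw_per_usable(copies=3)                          # 3x raw capacity
dual_parity = raw_per_usable(data_drives=8, parity_drives=2)  # 1.25x for 8+2
print(three_way, dual_parity)
```

At petabyte scale, the gap between 3.0x and 1.25x overhead is the difference the article alludes to in both capacity cost and data center floor space.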


How artificial intelligence is shaping our future


A world fuelled and enhanced by AI is one to look forward to. Autonomous cars will mean efficient and safe transport. Real-time translation buds that will enable you to speak one language and hear another will transform our travel experiences. Despite the cries of alarmists, there is little reason to believe that our AIs are going to “wake up” and decide to do away with us. ... New drugs, therapies and treatments will produce a revolution in the delivery of healthcare. What’s true for health is true for education, leisure, finance and travel. Every aspect of how individuals, corporations and governments function can be more effectively managed with the right application of the right data. ... Humans will come to confide, trust and rely on our new companions. They will support us for better or worse, in our prime and our decline. Powered by AI and abundant data, they may assume the characteristics of those dear or near to you. Imagine your late grandmother or your favourite rock star chatting helpfully in your living room.


Microsoft may soon add multi-session remote access to Windows 10 Enterprise

At this point, multi-session Remote Desktop Services (RDS) is a Windows Server-only feature, one that lets users run applications hosted on servers, whether the servers are on-premises or cloud-based. But the evidence uncovered by Alhonen hints that Microsoft will expand a form of RDS to Windows 10. "There's a ton of unanswered questions," said Wes Miller, an analyst at Directions on Microsoft, noting Microsoft's silence on such a move. He expected that some answers will be revealed at Microsoft Ignite, the company's massive conference for IT professionals that's set for Sept. 24-28, or with the release of Windows 10 1809 this fall. One thing he's sure of, however: "You won't see this running on hardware at a user's desktop," Miller said of Windows 10 Enterprise for Remote Sessions. Instead, he believes the SKU should be viewed as back-end infrastructure that will be installed at server farms in the virtual machines that populate those systems. If Windows Server serves - no pun intended - as the destination for remote sessions accessing applications or even desktops, why would Microsoft dilute the market with the presumably less expensive Windows 10 Enterprise SKU?


Will network management functions as a service arrive soon?

The cloud also eliminates the need for patching and upgrading software. Those functions would be handled by the vendor. In considering NMaaS, Laliberte said organizations should understand the underlying architectures, which in some cases could simply be individual licenses. "After that, it would come down to the cost model of Opex versus Capex, along with maintenance," he said. Laliberte said it is important to find out how the NMaaS offering charges and to determine the cost model and whether there are any ingress charges for data collected. One of the other big issues, he added, is security. "If you are in a regulated industry or have sensitive information traversing your network and that data is being sent to the cloud, make sure to get the security team engaged and that they approve the model." NMaaS also enables the collection and dissemination of benchmarking data, which companies can use to determine how their networks compare to those of their peers. "It is a capability that could be very helpful for organizations to understand and to improve their own environment," Laliberte said.


Apcela optimizes Office 365 performance, improving user productivity

The architecture of the network Apcela has built follows the model of Network as a Service. It starts with a core network anchored on globally distributed carrier-neutral commercial data centers such as Equinix, which Apcela calls application hubs, or AppHUBs. These data centers are then connected with high capacity, low latency links. That high-performance core then interconnects to the network edge, which can be enterprise locations such as branches, manufacturing facilities, regional headquarters, data centers, and so on. This core network also interconnects with the cloud, connecting to the public internet, or directly peering with cloud data centers, such as those operated by Microsoft, where the vendor hosts Office 365. ... A full security stack is also deployed to these commercial data centers. By distributing security and moving it out of the enterprise data center and into these distributed network nodes, a branch office simply goes to the nearest AppHUB to clear security there, and from there, it can go to the internet or to whatever SaaS applications the branch needs to use, rather than having to go all the way back through the enterprise data center before getting out to the cloud.


8 guidelines to help ensure success with robotic process automation

The first step is to find out what really goes on day-to-day in your organization. It is very surprising how many variants of processing can build up. Use process mining, process discovery tools or consultants to figure out what you actually do in a process. Methods to do so might include extracting systems logs, or mouse clicks and keystrokes to find out how many ways an activity can happen and then eliminate the less optimal ways to automate the most common paths. Many different tools can be used to support automation, especially ones that have best practice processes already in them. Filtering to see if RPA should be used needs to start with understanding the process in order to then understand the choices of automation available in the short, medium and longer term. If people don’t know what they do or how they do it, they’re not ready to start with RPA. Standardized, repetitive, re-keying tasks of digital data is the optimal place to start thinking if RPA makes sense or not.
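The log-mining step described above, reconstructing how many distinct ways an activity actually happens, can be sketched as a variant counter over an event log. The case/activity schema here is a simplified stand-in for what real process-mining tools extract from system logs:

```python
from collections import Counter

def path_variants(events):
    """Group an event log into per-case activity paths and count variants.

    events: (case_id, activity) pairs, already in timestamp order.
    """
    cases = {}
    for case_id, activity in events:
        cases.setdefault(case_id, []).append(activity)
    return Counter(tuple(path) for path in cases.values())

# A toy log: three cases of the same process, one of which took a detour.
log = [
    (1, "open"), (1, "approve"), (1, "close"),
    (2, "open"), (2, "rework"), (2, "approve"), (2, "close"),
    (3, "open"), (3, "approve"), (3, "close"),
]
variants = path_variants(log)
print(variants.most_common(1))
```

The most common path is the natural first candidate for RPA; the long tail of rare variants is where the "less optimal ways" to eliminate usually hide.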


How Smaller Financial Services Firms Can Win With Open-Banking Disruption

Even though only nine of Europe’s largest banks are required to comply with PSD2, many small and midsize financial services firms – as well as their much larger rivals – are warming to the idea of opening up their customers’ transactional data. Banks and insurers like the idea of using this information to propose more compelling lending options, credit lines, and investment services to their customers. But more importantly, their customers will have the power to dictate how their information is exchanged with other institutions to find the best way to manage their financial growth. According to the Oxford Economics study “The Transformation Imperative for Small and Midsize Financial Services Firms,” sponsored by SAP, small and midsize banks and insurers seem to be on the right digital path towards open banking. Surveyed participants indicated that they are heavily investing in efficient, scalable, and connected technology that can help keep their data and systems more secure and support innovation.


The Ethics of Security


The usual Black Mirror-style thought experiment (admittedly one not used in the 18th century) is to imagine you kindly drop by to visit a friend in hospital. On walking through the door, their new Benthamometer detects that your healthy heart, lungs, liver and kidneys could save the lives of 5 sick people inside, and your low social media friend count suggests few folk would miss you. Statistically, sacrificing you to save those 5 more popular patients is not only OK, it is morally imperative! It is the extreme edge cases of a utilitarian or statistical approach that are often the cause of algorithmic unfairness. If the target KPIs are met, then by definition the algorithm must be good, even if a few people do suffer a bit. No omelettes without some broken eggs! If you think this would never happen in reality, we only need to look at the use of algorithmically generated drone kill lists by the US government in Yemen. Journalist and human rights lawyer Cori Crider has revealed that thousands of apparently innocent people have been killed by America’s utilitarian approach to acceptable civilian casualties in a country it is supposed to be helping.



Quote for the day:


"Leaders know the importance of having someone in their lives who will unfailingly and fearlessly tell them the truth." -- Warren G. Bennis


Daily Tech Digest - August 12, 2018

Honestly, you should already be incorporating purpose into your company culture, regardless of technology. But the fact remains that making time to define that purpose—just like making time to define your business goals—will go a long way toward creating a digital culture that isn’t just present, but palpable and valued by customers and employees alike. For instance, when your purpose changes from making quality shoes to providing customers with a memorable shoe-buying/shoe-wearing experience, your digital culture lens will shift automatically. Suddenly, you’re thinking of ways to use technology in new and exciting ways for the customer, rather than just getting the job done. These are the kinds of shifts that drive digital transformation—and create positive change. Is change easy? No. But when it comes to creating successful digital transformation, the cards are on the table, and they’re incredibly easy to read. Successful transformation requires cultural change. Period. Following the steps above to build a stronger digital culture—however challenging—will help get you there.


React Native JavaScript framework stumbles

Facebook found that initial principles of React Native—serving as a single asynchronous, serializable, and batched bridge between JavaScript and native apps—made it harder to build features. The asynchronous bridge, for example, has meant JavaScript logic could not be integrated directly with native APIs expecting synchronous answers. And the batched bridge, which queues native calls, made it more difficult to have React Native apps call into functions implemented natively. ... Not everyone is waiting for Facebook to work out the kinks in React Native. Walmart Labs has built an open source platform, Electrode Native, for integrating React Native components into existing mobile applications. Running on Node.js 6 or later, Electrode Native lets developers select features to add to an application and packages them in a single library. Built-in dependency version control is included to control native dependencies for alignment to React Native components.


When Digital Innovation Becomes A Factory Of Business Outcomes

To claim a business and market leadership position, periodic or one-time innovation is not enough. Continuous innovation with the latest technologies is imperative to stay current with the processes and experiences that employees and customers demand. In some cases, global executives are committed to innovation but not satisfied with their innovation performance. This reality serves as an excellent reminder that a self-correcting structure, which includes measurement and the flexibility to change, is required to identify and implement innovations. Considering the low satisfaction among executives when it comes to innovation, a factory setup – which includes defined processes, connected systems, and one source of truth – emerges as the ideal self-correcting model. This factory approach inherently enables the repeatability, structure, measurability, root-cause visibility, continuous improvement, and cost optimization that businesses need to succeed.


The cost of a payment card data breach

The financial costs for breached organisations vary based on several factors. The size of the breach is the most important, but the affected payment channel, the number of servers and how those servers are interconnected also play an important role. There is also the cost of the breach response. Organisations will need to notify their regulator and affected parties. If they have planned for a breach, this will be less expensive than having to do it on the fly. The same is true if they intend to create a help page on their website and a phone line for customers to use to learn more about the incident. Organisations will probably also be required to cover the cost of a forensic investigation. A PCI Level 2 investigation will cost about £25,000–£50,000, and a Level 1 investigation will cost upwards of £100,000. Depending on the investigation’s findings, organisations might face tough disciplinary action. Fines for non-compliance are levied on the payment processors or card companies rather than the breached organisation.


How Payroll AI and Machine Learning Are Transforming Businesses

A number of problems surfaced even among companies that had felt like they had fully developed their payroll management solutions. One of the biggest problems was with tracking employee expenses. Only 33% and 21% of companies tracked domestic and global employee expenses, respectively. Many companies are just beginning to realize the imperfections in their payroll management processes. They are beginning to invest in new AI technology that can help them address these problems. ... Machine learning has helped payroll managers develop more efficient processes for handling these queries. One of the most common ways that they can improve interactivity and streamline customer service is by tracking the questions and responses between customers and payroll managers. After observing a pattern, they can help customer service representatives develop automated responses to the most frequent inquiries.
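The pattern-tracking idea can be sketched as a frequency count over a query log. The queries and the simple normalization below are invented for illustration; a production system would cluster semantically similar questions with ML rather than matching exact strings:

```python
from collections import Counter

def top_queries(query_log, n=3):
    """Most frequent (normalized) payroll queries - the first
    candidates for automated responses."""
    normalized = (q.strip().lower() for q in query_log)
    return [query for query, _ in Counter(normalized).most_common(n)]

# A toy query log with duplicates that differ only in case/spacing.
log = [
    "When is payday?", "when is payday?", "How do I submit expenses?",
    "When is payday?  ", "How do I submit expenses?", "Where is my W-2?",
]
print(top_queries(log, n=2))
```

Once the highest-volume questions are known, canned or automated responses can be built for them first, which is where the biggest customer-service time savings come from.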


From finance to healthcare: 5 fintech trends that will benefit digital health products


While finance tracks money, the fitness industry is driving advances in biometric wearables that monitor and record steps and active minutes, heart rate, calories burned, even sleep. The relationship between tracking fitness and larger health issues has always been implicit, and the traction and success of personal performance products provides a bridge to more dedicated health technology. Today, the self-generated tracking and performance trend is driving a proliferation of health-related apps that encourage and empower customers to proactively engage with their wellness. From cardio trackers, calorie counters, ovulation and menstruation apps, and pregnancy trackers, developments in more sophisticated clinical solutions are leading to technology that combines with wearables and mobile apps to monitor chronic conditions, including cardiovascular disease, diabetes, stress and mental health. And with advances in biometric accuracy, data analysis, machine learning, and AI algorithms, products and applications that tap this high-value data are forecast to proliferate.


IoT Innovation in Transportation

There has been continuous growth in the field of transportation over the past decade because of the crucial role of logistics in transport through air, water, or land. Managers or transportation heads have started searching for ways to improve profits and logistics by minimizing the costs associated with a project. While managers work on profits and ways to improve the transportation process, corporations have started to look towards the big data being generated in this field, which can point towards ways to improve transportation. IoT integrates perfectly with transportation, and there is huge scope to gather data at each step of the transportation process, giving managers a clear idea of how transport can be further improved. Trucks or any other type of transport facility can now be connected through IoT, which can provide the location of the vehicle, its movement, speed, and estimated time of delivery through the exchange of data. This was not possible previously; now it is, thanks to the introduction of IoT in this field.
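The estimated time of delivery mentioned above is, in its simplest form, just remaining distance over last reported speed. Here is a minimal sketch; a real fleet system would smooth the speed over a window and account for route and traffic:

```python
def estimated_arrival_hours(remaining_km, speed_kmh):
    """Naive ETA from a vehicle's last reported position and speed."""
    if speed_kmh <= 0:
        raise ValueError("vehicle is stopped; ETA undefined")
    return remaining_km / speed_kmh

# A truck 240 km from its destination, last reported at 80 km/h.
eta = estimated_arrival_hours(remaining_km=240, speed_kmh=80)
print(eta)  # → 3.0 hours
```

Even this naive estimate, refreshed with every telemetry update, is information dispatchers simply did not have before vehicles were connected.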


What is continuous integration (CI): Faster, better software development

Continuous integration is a development philosophy backed by process mechanics and software build automation. When practicing CI, developers commit their code into the version-control repository frequently, and most teams have a minimum standard of committing code at least daily. The rationale is that it is easier to identify defects and other software quality issues in smaller code differentials than in larger ones developed over extended periods of time. In addition, when developers work on shorter commit cycles, it is less likely that multiple developers will edit the same code and require a merge when committing. Teams implementing continuous integration often start with version-control configuration and practice definitions. Even though code is checked in frequently, features and fixes are implemented on both short and longer time frames. Development teams practicing continuous integration use different techniques to control which features and code are ready for production.
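One common such technique is the feature flag: unfinished work is merged to the mainline frequently but kept dark behind a toggle until it is ready. A minimal sketch (the class and flag names are hypothetical, not from the article):

```java
import java.util.Map;

// Minimal feature-flag sketch: unfinished code paths stay behind a toggle,
// so frequent commits to the mainline never expose incomplete features.
public class FeatureFlags {
    private final Map<String, Boolean> flags;

    public FeatureFlags(Map<String, Boolean> flags) {
        this.flags = flags;
    }

    // Unknown flags default to off, so new code paths are dark by default.
    public boolean isEnabled(String name) {
        return flags.getOrDefault(name, false);
    }

    public static void main(String[] args) {
        FeatureFlags flags = new FeatureFlags(Map.of("new-checkout", false));
        if (flags.isEnabled("new-checkout")) {
            System.out.println("new checkout flow");
        } else {
            System.out.println("legacy checkout flow");
        }
    }
}
```

In practice the flag map would come from configuration or a flag service rather than being hard-coded, so a feature can be switched on without a redeploy.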


Race Condition vs. Data Race in Java

A race condition is a property of an algorithm (or a program, system, etc.) that manifests as anomalous outcomes or behavior because of the unfortunate ordering (or relative timing) of events. A data race, by contrast, is a property of an execution of a program. According to the Java Memory Model (JMM), an execution is said to contain a data race if it contains at least two conflicting accesses (reads of or writes to the same variable) that are not ordered by a happens-before (HB) relationship (two accesses to the same variable are said to be conflicting if at least one of them is a write). This definition can probably be generalized by saying that an execution contains a data race if it contains at least two conflicting accesses that are not properly coordinated (a.k.a. synchronized), but I am going to talk about data races as they are defined by the JMM. And, unfortunately, the above definition has a significant flaw. ... Despite the incorrect definition stated by the JMM remaining unchanged, I am going to use a fixed version.
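The JMM definition can be made concrete with a small sketch (the class name and iteration counts are illustrative, not from the article). The unsynchronized increments of the plain `int` field are conflicting accesses with no happens-before ordering between threads, so the execution contains a data race; the `AtomicInteger` increments, by contrast, are properly ordered and never lose an update:

```java
import java.util.concurrent.atomic.AtomicInteger;

public class DataRaceDemo {
    // Conflicting, unsynchronized accesses: under the JMM this is a data race.
    static int racyCount = 0;
    // Atomic read-modify-write operations are ordered and never lose updates.
    static AtomicInteger safeCount = new AtomicInteger();

    public static void main(String[] args) throws InterruptedException {
        Runnable work = () -> {
            for (int i = 0; i < 100_000; i++) {
                racyCount++;                  // lost updates are possible here
                safeCount.incrementAndGet();  // always counted exactly once
            }
        };
        Thread t1 = new Thread(work);
        Thread t2 = new Thread(work);
        t1.start(); t2.start();
        t1.join();  t2.join();
        // safeCount is always 200000; racyCount may be anything up to 200000.
        System.out.println("safe=" + safeCount.get());
    }
}
```

Note that the racy version may still print 200000 on a given run; a data race is defined over the accesses in an execution, not over whether the wrong answer actually appeared, which is exactly why it is a different notion from a race condition.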


Multiple Solutions Harden Posture but Create Agent and Alert Fatigue

CISOs agree that prevention is faulty, but investigation is a burden. EDR capabilities can provide improved detection and response approaches to prolific security incidents, and automation can help to address the global shortage of cybersecurity professionals. Specifically, EDR tools best fit resource-strapped businesses with lean IT teams that operate without a Security Operations Center (SOC). However, half of IT executives worldwide said that managing EDR tools is difficult or very difficult. In both the US and UK, 49 percent of all endpoint alerts triggered by monitoring and response techniques turned out to be false alarms. Sixty-four percent of Americans in companies with no SOC said monitoring activities are among their toughest challenges. Spotting an ongoing breach also means fighting the alert fatigue caused by noisy traditional security solutions. Filtering security alerts is a race against time, which can be especially difficult if the organization is understaffed and overburdened.



Quote for the day:

"Tact is the ability to tell someone to go to hell in such a way that they look forward to the trip." -- Winston S. Churchill