Daily Tech Digest - July 06, 2019


There is still a long way to go before PR achieves artificial intelligence nirvana, Graham explains. Talking about Alexa and Google Home devices and autonomous cars, she continues: “These are employing machine learning to improve performance by analyzing and incorporating the data that they receive, but they aren’t exactly flawless and they employ many thousands of data scientists (along with millions of everyday users) to inform and refine the technology.” In today’s competitive market, communicators must continue investing in AI. However, Rausch advises organizations to stay safe by remaining operational without depending on artificial intelligence promises and by taking advantage of how technology has proven itself to empower the PR process. Computers can act very decisively within seconds when they find evidence that something is happening. This shouldn’t mean that we can fully trust automated insights. “Hesitation is a very human thing to do. Computers don’t hesitate…they are absolutely literal,” explains Rausch.



Beyond Limits: Rethinking the next generation of AI

Beyond Limits evolved out of work with NASA's Jet Propulsion Laboratory (JPL) for remote rovers used to explore places like the moon and Mars. Due to the communications lag in space, real-time control is virtually impossible. Any AI solution must not only be fully autonomous, it must be able to train and, ideally, correct itself. When there is a problem it can’t correct, the bandwidth limitations for communication make full reprogramming problematic…but point patches are certainly possible. This resulted in an AI platform uniquely able to be updated, modified and, to a certain and initially limited extent, able to both teach itself and make corrections while disconnected. This unusual requirement has likely made the resulting AI nearly ideal for areas where the AI must often act independently of oversight – and/or in areas where problems can escalate very rapidly – and where the AI must be able to deal with a diversity of known and unknown issues.


Transforming Organisations, Changing The Nature Of Work

In the Fourth Industrial Revolution the biggest challenge for every business will be ‘speed to capability’. In other words, how quickly can a company retool itself, both in terms of technology and skills, in order to perceive, analyse, understand, and respond to continuously changing customer behaviour and expectations? Cloud technologies can provide retooling agility, but that is not enough. Companies will need to reorganise work in order to obtain human agility as well. They need to be able to access and deploy a wide range of skills quickly and on demand. This means that we must forget the concept of a ‘job’. This concept is a relic of the First Industrial Revolution, where stability was critical for business success and people were deployed in stable organisational units. ... Human workers will be defined by their skills and not by job titles. In such a world, leadership needs to radically change too. Instead of a supervisory role that ensures processes are dutifully followed by all, the new leaders should be more like orchestra conductors: bringing together diverse talent and technology into a coherent whole that can deliver an excellent performance whatever score you put in front of them.


Experts Discuss Data Science and Machine Learning Best Practices

Surviving and thriving with data science and machine learning means not only having the right platforms, tools and skills, but identifying use cases and implementing processes that can deliver repeatable, scalable business value. The challenges are numerous, from selecting data sets and data platforms, to architecting and optimizing data pipelines, to model training and deployment. In response, new solutions have emerged to deliver key capabilities in areas including visualization, self-service and real-time analytics. Along with the rise of DataOps, greater collaboration and automation have been identified as key success factors. DBTA recently held a webinar with Bethann Noble, director of product marketing, machine learning, Cloudera; Gaurav Deshpande, VP of marketing, TigerGraph; and Will Davis, senior director of product marketing, Trifacta, who discussed new technologies and strategies for expanding data science and machine learning capabilities.


Visa says payment industry can move away from using passwords


For security-minded individuals, mobile device manufacturers have addressed concerns about stolen biometric information by storing and encrypting biometric templates — algorithmic representations instead of actual biometric attributes — locally on consumer-owned devices instead of the cloud. This ensures an individual is always in possession of their personal biometric data with the option to delete the data at any time. In addition, authentication accuracy is bolstered by liveness detection used by biometric scanners and software that can identify if a fingerprint is copied or a facial scan is of a mask. It’s been roughly six years since fingerprint sensors were integrated into consumer smartphones and in this short amount of time, consumers have grown increasingly comfortable with the approach. The need for quick and easy authentication will only increase with the growth of digital products and services, and remembering unique passwords for every internet-connected device or app is untenable.


How to Effectively Lead Remote IT teams

The best measure of team collaboration is how many times team members interact. If you are not at the same office, make sure you design special meetings for interactions. Encourage people to do it even online. For example, tell George to call Stefan, because he has something interesting to share! Or ask your remote team to go together for lunch and give them a topic to talk about. It is very inspiring if the topic is not about work, but rather something of higher value, such as how we can stop poverty or why we ended up with the situation in Syria. The last one goes to live interaction and having shared fun experiences. If you doubt the value of getting people in the same place, consider that I personally invest in inviting our partners to join us during our team building events. Each year we celebrate our company anniversary at the seaside – this year it will be our 13th and we will celebrate at one of the best Black Sea resorts – if you are one of our partners just reach Burgas on 13 July – the rest is on us!


Five Things to Understand About Digital Transformation

Today, leaders are talking about 3D printing, Internet of Things (IoT), Robotics and similar advancements in digital technology, which can drastically impact organizations and industries. Many leaders, however, are missing a few key points, resulting in a failure to leverage the power of digital in their business, and becoming irrelevant instead: Digitalization is not just a technology trend; it is an overarching business transformation driven by a shift in the organizational mindset. Digitalization is not characterized by creating mobile apps and having a social media presence; it is an entirely new approach to business. Digitalization is not the same as digitization: digitization is the process of converting the physical and analogue into something that’s virtual and digital, while digitalization is leveraging technology to create an exceptional customer experience, become agile and unlock new value. We are in the Era of the Digital BLUR. Organizations leveraging the power of digital are playing by very different rules and are attacking the incumbents from practically every industry. How?


Scrum Team: What Is Your Inner Compass?

If you can create a vision for your Scrum team, and you find them all aligned in the right direction, your work has a reasonably good chance at success. Have you ever been part of a Scrum team that struggled to make any progress? Team members had plenty of talent, required resources, and opportunities, but they just couldn't progress enough and create impact. If the above scenario sounds familiar to you, there's a strong possibility that you might find reading this article valuable. Great vision precedes success, and a compelling vision provides the right direction to the team. If you are unaware of the team's vision, you can't act with conviction. If you haven't inspected the vision in light of your purpose, you can't even be sure that the team you are on is the appropriate one for you. If the team members have an agenda of working against each other, the team's spirit and drive gets lost. Conversely, a team that embraces a vision is more focused, energized, and committed. It knows the reason for its existence. So, how do you inspect a team vision? How do you know whether it is worthy and compelling enough to drive people?


Automated Peril: Researchers Hack 'Smart Home' Hubs

They managed to retrieve a hardcoded SSH private key (CVE-2019-9560) in the controllers. By removing and then imaging an SD card from the controller, they were able to extract the private key, which was needed for root access. The private key was stored in a password-protected folder called /etc/dropbear and named dropbear_rsa_host_key. But they were still able to extract it despite it being password-protected. That SSH key isn't unique, either, so it could be reused across other controllers. In a short video, they show how it is possible to use the vulnerabilities to unlock a Yale lock linked to the controller. The researchers also discovered a local API authentication problem. They found a SHA1 password hash, and because the controller uses the pass-the-hash method rather than requiring the credentials to be input, they were able to construct a working authentication request. After that, they say it would be possible to send an authenticated request to unlock a lock.
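
To make the pass-the-hash weakness concrete, here is a purely illustrative Python sketch; the function and field names are assumptions for demonstration, not details of the Zipato controller. It shows why an API that accepts a stored SHA1 hash directly as the credential turns the hash itself into a reusable password.

```python
import hashlib

# Illustrative only: a toy pass-the-hash check. Names and flow are hypothetical,
# not taken from the affected product.

STORED_SHA1 = hashlib.sha1(b"owner-password").hexdigest()  # what the hub stores

def authenticate(request_credential: str) -> bool:
    # A pass-the-hash design compares the submitted value against the stored
    # hash directly, so the hash itself is a sufficient credential.
    return request_credential == STORED_SHA1

# A legitimate client hashes the password before sending it...
assert authenticate(hashlib.sha1(b"owner-password").hexdigest())
# ...but an attacker who has only recovered the stored hash authenticates just as well.
assert authenticate(STORED_SHA1)
```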


What CIOs and CTOs Can Learn From Smart Cities

"One of the hard things is determining the problems you may not know about. One of those things is one-way streets," said Sherwood. "We've been working with NTT to count the number of drivers, and based on historic data and analytics we can start predicting when we might have a wrong way driver. We're able to count the number of vehicles to determine whether we need to invest more in signage. change the road layout, [or otherwise] solve the problem." Las Vegas is also using video analytics to improve traffic flow through dynamic signal timing. It's also counting pedestrian traffic and monitoring environmental factors, all for the purpose of better decision-making. ... "We have a variety of projects we're working on that focus on six key areas: public safety, education for workforce development and to help our population prepare for the future, economic development, health and wellness, social aspirations to try to close the digital divide, and mobility which focuses on how we move people around the city more efficiently."




Quote for the day:


"Leaders are people who believe so passionately that they can seduce other people into sharing their dream." -- Warren G. Bennis


Daily Tech Digest - July 05, 2019

Are Programming Languages Key to the Evolution of Machine Learning?

“We are at the stage where the data, compute and deep learning algorithms that are absolutely necessary to make AI a reality have all become abundant,” Tutuk said. “But just like the early days of computer technology, the use of state-of-the-art AI is locked out of the reach of millions of developers. At present, even the most popular deep learning frameworks all require a great level of expertise.” There was a similar trend in the early days of developing computer software, she said, and programming languages like Fortran, C, and C++ were easier to use than assembly languages but were still largely inaccessible to most. It was the development of high-level programming languages like Java, Python, and PHP that made computer programming much more widely accessible around the world. “Without these high-level abstractions, the digital world as we know it today would not exist,” Tutuk said.



These are the top skill sets for a successful blockchain team

Those entering the blockchain development/engineering field should have the mentality of a hacker - or the ability to problem solve collaboratively in a workshop setting when a client presents a business problem. They need to be able to think through the business objectives, implications and value "for each of the participants and then [define] the architecture and overall solution flow," KPMG said. "It is this collaborative approach that leads to a successful application of blockchain." Given the lack of coursework around blockchain and its relatively new existence in the enterprise, a team must be open to exploring and experimenting by "hacking the problem" from a business and IT perspective, according to KPMG. "I'd say at KPMG we've been very successful at taking [employee] skills in-house and upscaling them to deliver blockchain skills," Keele said. "Until universities start printing blockchain degrees, that will be the pattern that will continue."


Security and privacy key to smart buildings and cities


One of the biggest challenges is the huge number and variety of stakeholders who all have a role to play and need to work in collaboration. These include building owners, property developers, landlords, building occupants, architects, technology suppliers, building services engineers, town planners, chief security officers, chief information security officers, data protection officers and more. At the core of the security problem is the fact that many of the systems that smart buildings and cities will need to rely on will be linked to a wide variety of internet of things (IoT) connected devices and sensors that are potentially vulnerable to cyber attacks. The whitepaper underlines the importance of considering and evaluating cyber security throughout the whole supply chain to protect data, maintain privacy and keep risk associated with cyber threats to a minimum. According to the whitepaper, this process should always start by looking at device security and the supplier’s cyber maturity.


How Developers Can Learn the Language of Business Stakeholders

Business stakeholders are not your enemies; once they have enough sound information on where we are and what we expect to happen shortly, they are willing to accept reasonable requests or decisions - even additional learning time which team members may require. What you can do is approach the opportunities for learning using the learning curve effect. Although this is something we should not avoid, I often see this element being skipped over when planning a project or forecasting work. We tend to translate the metrics and statistics from the stable state to the initial phases, when the team is still forming. Similarly, this applies to the new person in a role, who needs to learn not only her place in the project, but very often new responsibilities. Let’s keep this in mind when planning work or identifying impediments. You can read more on the learning curve effect in a separate article I wrote on the topic, Never stop learning – why is learning curve effect so powerful?


Enterprise architect role is more about business than ever


"In the past that's been somewhat separated," Nelson said. "There might have been dedicated business architects running around that live on the business side that may or may not interact with EA." He said that in a similar vein, CIOs are now frequently called to the overall business strategy table, and that trend is dragging all of IT -- particularly enterprise architects -- in the same direction. But these organizations need more than just a general IT liaison. Now that businesses put so much value on their digital strategy, they need constant input from architects that possess an intimate understanding of their software capabilities and can shape development practices to meet specific business needs. Aslinn Merriman, emerging technology architect at Sargento, Inc., a large food production company based in Plymouth, Wisconsin, agreed that the architect's purpose is to help set a strategy and facilitate development goals that align with other business units and the overall organization.


US Cyber Command Warns of Outlook Vulnerability Exploits

While the warning from Cyber Command did not offer many details, some security researchers, including analysts with Chronicle - the cybersecurity arm of Alphabet - suspect that this latest attack is related to the activity of an advanced persistent threat group known as APT33, which also goes by the name Shamoon. In research that FireEye published in 2017, analysts found that APT33 has possible ties to Iranian intelligence and has previously targeted aerospace and energy firms in the Middle East. Over the last two weeks, the U.S. Department of Homeland Security's Cybersecurity and Infrastructure Agency has warned about an increase in Iranian espionage and cyber activity, including increasing use of so-called "wiper" attacks that render computers unusable. One of the largest wiper attacks ever recorded targeted the oil giant Saudi Aramco in 2012. In that case, the attackers used malware also called Shamoon, which has appeared in other attacks over the course of the last several years.



Facebook open-sources DLRM, a deep learning recommendation model  

Facebook AI Research (FAIR) open-sources a lot of its work, but its parent company is making DLRM available for free to help the wider AI community address challenges presented by recommendation engines, like a need for neural networks to associate categorical data with certain higher-level attributes. “Although recommendation and personalization systems still drive much practical success of deep learning within industry today, these networks continue to receive little attention in the academic community,” the paper reads. “By providing a detailed description of a state-of-the-art recommendation system and its open-source implementation, we hope to draw attention to the unique challenges that this class of networks present in an accessible way for the purpose of further algorithmic experimentation, modeling, system co-design, and benchmarking.” The makers of DLRM suggest the model be used for benchmarking the speed and accuracy performance of recommendation engines. The DLRM benchmark for experimentation and performance evaluation is written in Python and supports random and synthetic inputs.
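
For readers unfamiliar with this class of model, the sketch below is a minimal, hypothetical PyTorch illustration of the embedding-plus-MLP pattern that recommendation models such as DLRM use to associate categorical features with learned attributes. It is not Facebook's implementation, and all layer sizes are arbitrary placeholders.

```python
import torch
import torch.nn as nn

class TinyRecModel(nn.Module):
    def __init__(self, num_categories=1000, emb_dim=16, num_dense=4):
        super().__init__()
        # Categorical features (e.g. a user or item id) become dense embeddings.
        self.embedding = nn.Embedding(num_categories, emb_dim)
        # Dense features and the embedding are concatenated and scored by an MLP.
        self.mlp = nn.Sequential(
            nn.Linear(emb_dim + num_dense, 32),
            nn.ReLU(),
            nn.Linear(32, 1),
            nn.Sigmoid(),
        )

    def forward(self, cat_ids, dense_feats):
        emb = self.embedding(cat_ids)             # [batch, emb_dim]
        x = torch.cat([emb, dense_feats], dim=1)  # [batch, emb_dim + num_dense]
        return self.mlp(x)                        # predicted interaction probability

model = TinyRecModel()
# Random/synthetic inputs, in the spirit of the benchmark described above.
scores = model(torch.randint(0, 1000, (8,)), torch.randn(8, 4))
print(scores.shape)  # torch.Size([8, 1])
```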


Google debuts Deep Learning Containers in beta

The service, called Deep Learning Containers, can be run both in the cloud or on-premises. It consists of numerous performance-optimized Docker containers that come packaged with various tools necessary to run deep learning algorithms. Those tools include preconfigured Jupyter Notebooks, which are interactive tools used to work with and share code, equations, visualizations and text, and Google Kubernetes Engine clusters, which are used to orchestrate multiple container deployments. The service also provides machine learning acceleration capabilities with Nvidia Corp.’s graphics processing units and Intel Corp.’s central processing units. Nvidia’s CUDA, cuDNN and NCCL machine learning libraries are also thrown in. In a blog post Wednesday, Google software engineer Mike Cheng explained that Deep Learning Containers are designed to provide all of the necessary dependencies needed to get applications up and running in the fastest possible time. The service also integrates with various Google Cloud services, such as BigQuery for analytics, Cloud DataProc for Apache Hadoop and Apache Spark, and Cloud Dataflow for batch processing and streaming data using Apache Beam.


Implementing IoT – overcoming barriers to commercial adoption


The basic architecture of IoT comprises four domains: the sensors, the connectivity of those sensors, the data hub that enables the data from all sorts of sensors to be interoperable (rather than stuck in silos), and the applications. The data hub plays a vital role in presenting the data to the applications in a uniform way, and Davies highlighted the work being done at CityVerve, a smart city demonstrator in Manchester encompassing a smart cycle light trial to understand cycle usage and improve cycle routes, an air quality trial which is linked to traffic density, and a water usage trial for leak management and demand management. Edge computing will play an important role in reducing connectivity demands, and zero-touch device management will be essential. Stuart Higgins, head of smart cities and IoT at Cisco, talked about some of the IoT trials and commercial deployments in the UK and worldwide. Many companies are digitising – seeing their operations and products as data to be managed in an IoT context.


3 serverless development strategies for stateful applications

Functions should be directly accessible to each other. Without immediate connections, functions depend on a slow storage medium to transport data from one function to another, building up latency. In real-time application scenarios -- such as 24/7 monitoring systems -- latency is unacceptable. Serverless functions predominantly underpin short-term workloads, which means that resources are allocated to them when requested and taken away once the request ends. Stateful applications developed on serverless functions can't use traditional mechanisms to work, such as global variables that can hold data throughout the application's lifetime. It's impossible for stateless functions to read from and write to disk, and the application can't maintain a constant connection to the database. To create stateful applications, serverless developers can manage application state with database connections, an event payload or backend as a service (BaaS) to integrate with the application.
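
As a rough illustration of that last point, the following hypothetical AWS Lambda-style handler (Python; the table name, key schema and event fields are assumptions) keeps its state in an external store rather than in the function itself:

```python
import boto3

# The function holds no state between invocations, so a per-session counter is
# kept in an external store - here a DynamoDB table named "app-state", which is
# an assumed name for illustration.
table = boto3.resource("dynamodb").Table("app-state")

def handler(event, context):
    key = {"pk": event["session_id"]}
    item = table.get_item(Key=key).get("Item", {"pk": event["session_id"], "count": 0})
    item["count"] += 1            # state change derived from the event payload
    table.put_item(Item=item)     # persisted externally, not inside the function
    return {"session": event["session_id"], "count": item["count"]}
```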



Quote for the day:


"A leader is best when people barely know he exists, when his work is done, his aim fulfilled, they will say: we did it ourselves." -- Laotzu


Daily Tech Digest - July 04, 2019

Security flaws in a popular smart home hub let hackers unlock front doors

The researchers conceded that their findings weren’t a perfect skeleton key into everyone’s homes. In order to exploit the flaws, an attacker would need to be on the same Wi-Fi network as the vulnerable smart hub. Dardaman said any hub connected directly to the internet would be remotely exploitable. The researchers found five such vulnerable devices using Shodan, a search engine for publicly available devices and databases. Zipato says it has 112,000 devices in 20,000 households, but the exact number of vulnerable hubs isn’t known. We asked SmartRent, a Zipato customer and one of the largest smart home automation providers, which said fewer than 5% of its apartment-owning customers were affected by the vulnerable technology. A spokesperson wouldn’t quantify the figure further. SmartRent said it had more than 20,000 installations in mid-February, just weeks before the researchers’ disclosure. For its part, Zipato fixed the vulnerabilities within a few weeks of receiving the researchers’ disclosure.


The big question now is whether this is a one-time breather in the "Windows as a service" schedule or whether it becomes the new normal for Windows 10 releases. The timing is certainly no accident. The 19H2 Windows 10 feature update is the last Windows 10 release before the end of free support for Windows 7 on January 14, 2020. The last thing Microsoft wants is any kind of discouraging publicity about negative upgrade experiences in those final few months before the free support window closes. This week's announcement is the latest in a series of major changes to the Windows 10 release schedule, including some that represent a 180-degree turnaround from the original "Windows as a service" model. The new rules depend on which Windows edition you've deployed. For OEM and retail Windows editions, even the lowly Windows 10 Home, feature updates are no longer mandatory. Instead, the twice-yearly feature updates are offered on PCs that Microsoft's algorithms deem suitable;


Mac Malware Pushed via Google Search Results, Masquerades as Flash Installer

The malware was discovered by researchers being distributed via numerous sites – some of which popped up on Google search results. One such site, called “GetComics,” purported to share digital copies of new comic books for free. The malware was also spread via high-ranking Google search results, which were observed redirecting users to multiple sites. “We were actually in the process of coming up with a name for CrescentCore, and searched for ‘CrescentCore’ in quotation marks, and one of the links in the first page of search results redirected to a page that happened to be distributing a new sample of CrescentCore,” Long said. The researcher said that oftentimes malware distributors will find vulnerable blogs or other sites with high Google search engine rankings, and add a redirection mechanism that bounces through a number of affiliate links – ultimately redirecting users to a fake Flash Player landing page. “So if a result for a previously almost unused word like CrescentCore happened to show up in search results, it’s extremely likely that other search results are poisoned with redirections to this malware as well,” he said.


The Future of Anti-Fraud Technology

Technological advancements present opportunities for both fraud perpetrators and those trying to stop them. As criminals find new ways to exploit technology to commit their schemes and target new potential victims, anti-fraud professionals must ensure they are likewise adopting new technologies that are the most effective in navigating the evolving threat landscape. But which technologies are most effective in helping organizations manage their fraud risk? Which tools provide benefits that outweigh the costs? How are organizations successfully harnessing the power of data and technology as part of their anti-fraud programs? These issues were discussed in today’s roundtable session, “Benchmarking Your Use of Anti-Fraud Technology.” The answers to those and other questions can be crucial in gaining management buy-in and successfully implementing new anti-fraud technologies. Luckily, attendees were able to review the recently released Anti-Fraud Technology Benchmarking Report. A publication of the ACFE, the report was developed in partnership with SAS and serves as a road map to what the future technological landscape will hold.



Neuroscience Is Going to Change How Businesses Understand Their Customers


Neuroscience, it turns out, can help change how companies think about new opportunities, and specifically, within the emerging field of applied neuroscience. Applied neuroscience is best described as the use of neuroscience tools and insights to measure and understand human behavior. Using applied neuroscience, leaders are able to generate data about critical moments of decision making, and then use this data to make confident choices that help to navigate the future of an initiative. Studies using applied neuroscience are often conducted outside of a lab context, and therefore rarely use large, stationary MRI (magnetic resonance imaging) scanners. Instead, these studies focus on using more mobile solutions such as EEG (electroencephalograph) headsets, combined with eye-tracking technology to capture precise data on how the brain reacts when presented with certain scenarios. Because of this, applied neuroscience is used primarily during one of two points in a new project — either at the onset while defining the business problem, or later in the cycle while seeking new solutions for users.


AI Differential Privacy and Federated Learning


Differential Privacy enables us to quantify the level of privacy of a database. This can help us to experiment with different approaches in order to identify which is best to preserve the user’s privacy. By knowing our data privacy level we can then quantify the likelihood that someone might be able to leak sensitive information from the dataset and how much information can be leaked at most. ... Machine Learning models which make use of a large amount of data are traditionally trained using online servers. Companies like Google and Apple used to take records of their mobile users' device activity and then store them in their cloud services to create a centralized Machine Learning model able to improve the performance of their mobile services. Nowadays, these big companies are moving instead towards using a decentralized model approach called Federated Learning. Using Federated Learning, the Machine Learning model is trained at the data source and its output is then moved to the cloud for further analysis. This means that companies like Google and Apple no longer need to access their users' data to improve their services.
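
A minimal sketch of the federated averaging idea, assuming a toy linear model and NumPy; this is illustrative only and not how Google or Apple implement it. Each device computes an update on its own data, and the server only ever sees model weights, never the raw data.

```python
import numpy as np

def local_update(global_weights, local_data, lr=0.1):
    # Stand-in for on-device training: one gradient step of a linear model
    # fitted to (x, y) pairs that never leave the device.
    x, y = local_data
    grad = x.T @ (x @ global_weights - y) / len(y)
    return global_weights - lr * grad

global_w = np.zeros(3)
devices = [(np.random.randn(20, 3), np.random.randn(20)) for _ in range(5)]

for _ in range(10):                                           # federated rounds
    updates = [local_update(global_w, d) for d in devices]    # happens on-device
    global_w = np.mean(updates, axis=0)                       # server averages weights only

print(global_w)
```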


Open Banking is less about technology, more about people


A key element is being able to translate the opportunities of new technology to all stakeholders in the ecosystem. No, a bank will not become an ecosystem… it is an integral part of it. This means that collaboration is an essential part of the success, not imposing partnerships. This balanced partnership will (1) allow more creative thinking and (2) facilitate change management, as people are more open to change when dialogue is not top-down. To come back to the idea of the customer challenge: by partnerships I do not mean only technology partners, but also for example merchants and other parties that can help in convincing the end-customer to adapt to the new world of digital banking. Financial education is a key aspect of getting the market ready for Open Banking. You don’t need to explain what Open Banking is, but it may be useful to show the advantages of the new way of banking. Banks no longer have a monopoly on this financial education. To take the example of the wearables: it is good to have banks promoting it, but if the merchant does not promote it, the bank will get nowhere in volumes: it is a two-way street.


Thousands of Facebook Users Hit in Malware Distribution Campaign

Researchers from Check Point Software uncovered the campaign recently when investigating a Facebook page impersonating Khalifa Haftar, commander of the Libyan National Army. The page, created in April, offered posts about airstrikes, terrorists being captured, and other content likely of interest to people in Libya. With more than 11,000 followers, the page contained URLs for downloading files that were often described as documents containing evidence of countries like Qatar and Turkey conspiring against Libya, or containing photos of pilots captured when bombing Tripoli and other lures. Some URLs purported to be to sites where citizens could sign up for the army. Facebook users on mobile and desktop devices who clicked on these links ended up downloading a variety of known remote administration tools used for spying and stealing data. Check Point's investigation of the fake Khalifa Haftar Facebook page shows that the individual behind it had been distributing malicious links through more than 30 other Facebook pages since at least 2014.


The Role of Big Data Analytics and AI in the Future of Healthcare


Machine learning algorithms can identify patterns and make predictions using cloud computing data lakes and data warehouses that clean (creating a single ‘source of truth’ in the data) and store huge amounts of data, which enables the integration of multiple health care systems for the purpose of providing better and more targeted care based on an individual’s electronic health record. Oncology and cancer research have received the most investment in precision medicine through the study of cancer genetics. In some instances, cancer treatments can be suggested based on genetic drivers of the cancer and not on the physical location of the cancer itself within the patient’s body. Moffitt Cancer Center in Tampa, FL has been working on incorporating molecular genomics, demographics and trial outcomes to develop models for each patient. The use of precision medicine will not only lower the risks of using sometimes incompatible treatments and medicines but also build new solutions for fighting disease and delivering healthcare. A startup called Deep Genomics uses AI and the genome to determine the best drug therapies for each individual.


How DHL is securing 'the world's most international company'

The aim? To "integrate and interoperate" with critical security technology components to better “identify, visualise and prioritise” critical security information in near real-time, in addition to providing timely remediation and responses to reduce possible business disruptions from cyber attacks. “With an influx of emerging and disruptive technologies such as ML, AI and the Internet of Things (IoT), organisations need to attain high levels of confidence in cybersecurity to compete and dominate in the digital space,” explained Chim. “Cybersecurity, rather than being a blocker or damage controller, has become a prioritised commercial investment for several businesses." “Organisations dealing with digital transformation in any form are enforcing cybersecurity in every technology surface to ensure secure operations and meeting data privacy compliance.” At the same time, Chim said cybersecurity industries are also adopting AI, sensor and blockchain solutions to collect, analyse and enrich significant number of events and intelligence to better prevent, detect and respond to security threats.



Quote for the day:


"Managers work to see numbers grow. Leaders work to see people grow." -- Simon Sinek


Daily Tech Digest - July 03, 2019

How serverless computing makes development easier and operations cheaper

Two of the biggest benefits of serverless computing should be clear: developers can focus on the business goals of the code they write, rather than on infrastructural questions; and organizations only pay for the compute resources they actually use in a very granular fashion, rather than buying physical hardware or renting cloud instances that mostly sit idle. As Bernard Golden points out, that latter point is of particular benefit to event-driven applications. For instance, you might have an application that is idle much of the time but under certain conditions must handle many event requests at once. Or you might have an application that processes data sent from IoT devices with limited or intermittent Internet connectivity. In both cases, the traditional approach would require provisioning a beefy server that could handle peak work capacities—but that server would be underused most of the time. With a serverless architecture, you’d only pay for the server resources you actually use. Serverless computing would also be good for specific kinds of batch processing.



Tempered Networks simplifies secure network connectivity and microsegmentation

The TCP/IP protocol is the foundation of the internet and pretty much every single network out there. The protocol was designed 45 years ago and was originally only created for connectivity. There’s nothing in the protocol for security, mobility, or trusted authentication. The fundamental problem with TCP/IP is that the IP address within the protocol represents both the device location and the device identity on a network. This dual functionality of the address lacks the basic mechanisms for security and mobility of devices on a network. This is one of the reasons networks are so complicated today. To connect to things on a network or over the internet, you need VPNs, firewalls, routers, cell modems, etc. and you have all the configurations that come with ACLs, VLANs, certificates, and so on. The nightmare grows exponentially when you factor in internet of things (IoT) device connectivity and security. It’s all unsustainable at scale. Clearly, we need a more efficient and effective way to take on network connectivity, mobility, and security.


SOCIAL HUB - The latest trends on network transformation, all in one place.

It’s reasonable to expect the nature of 5G services to change rapidly. A software-defined network infrastructure is flexible enough for CommSPs to speed the rollout of customizable applications and service models, ease customer provisioning and improve network operation and management efficiency. Staying in lock-step with evolving 3GPP specifications and 5G services implementation will require flexibility that only software-based infrastructure can provide. For example, one major US CommSP shared early plans for tiered 5G pricing based on data speeds, similar to broadband Internet pricing plans. The ability to support custom charging and new mobility service scenarios will be key to establishing, testing and evolving pricing structures and business models. ... Network engineers and architects can deploy these servers in the core or network edge, which makes it possible to scale to multi-terabit configurations in the core network and share consistent infrastructure and software with distributed locations. CommSPs can use this network infrastructure to apply a cloud native architecture that drives efficiencies, speeds deployments and meets SLA requirements that are very different from those of the cloud computing industry.



TA505 Group Launches New Targeted Attacks

The evasion and anti-analysis capabilities built into modern malware tools like AndroMut highlight the need for multilayered protections. In addition to securing emails and endpoint devices, organizations need to monitor for malware communication with command-and-control systems, Dawson notes. For enterprises, the threat posed by TA505 appears to be growing, according to Proofpoint. The group is behind some of the largest email campaigns ever, including one to distribute the Locky ransomware. Through 2017 and the first half of 2018, TA505 launched such massive campaigns that they dramatically affected global malicious email volumes, Dawson says. "The group saturated organizations with Locky ransomware and the Dridex banking Trojan," he notes. When TA505 shifted to smaller — though still relatively large — campaigns distributing RATs and other malware, it triggered a similar shift in this direction among other attackers that continues today, Dawson says.


The 'Going Dark' Debate: It's Back

The impetus, as usual, is law enforcement and intelligence agencies' concern over "going dark." In other words, suspects in an investigation - centering on child abuse, terrorism, drug trafficking or any other type of criminality - might be using communications techniques on which investigators cannot easily eavesdrop. The NSC advises the president on national security matters and coordinates policies across government departments. Last week's gathering of the NSC's Deputies Committee, three unnamed people with knowledge of the meeting told Politico, does not appear to have resulted in any decision to change current policies. "The two paths were to either put out a statement or a general position on encryption and [say] that they would continue to work on a solution, or to ask Congress for legislation," one of the people told Politico. One of the chief proponents of anti-crypto legislation was Deputy Attorney General Rod Rosenstein, but with his departure, the appetite for legislation meant to tackle the "going dark" problem appears to have waned, Politico reports.


Disaster recovery readiness is essential for hybrid and multi-cloud strategies

Use of the cloud within a disaster recovery plan offers many benefits, including reliability and cost efficiency, as there is no need to invest in infrastructure that may never be used. Cloud resources can be offsite, mitigating the risk of a disaster affecting the main office location, and can be accessed (and paid for) only as needed. A multi-cloud disaster recovery strategy offers additional peace of mind that critical systems and data will remain easily accessible when needed. Although hybrid and multi-cloud deployments are widely acknowledged as good practice, IT professionals highlight complexity, training gaps and lack of internal resources in their hesitancy to deploy using multiple clouds. Nevertheless, more than half the respondents were operating in a multi-cloud environment, with nearly one in ten using five or more clouds within their organizations. “What we’re hearing from customers, and is consistent with our survey findings, is that they’re looking for ways to simplify and streamline their cloud deployment and management,” said Ziad Lammam, vice president of product management for Teradici.


Why AI Will Replace Rocket Scientists Before It Ever Replaces Marketers


According to world-renowned inventor and futurist Ray Kurzweil, looking at AI as a threat is unnecessary. Instead, humans should embrace technological advancements and allow them to, in turn, make us smarter. Machine learning has come a long way in recent years. AI algorithms have been honed and perfected, enabling machines to learn and update on their own. While this has affected all walks of life, when it comes to marketing, AI has helped improve the customer experience exponentially. In today’s day and age, consumers expect companies to always be on, and they expect messages to be personalized. AI helps marketers achieve this level of personalization without having to work 24/7. It’s ironic because this automated, mechanical tool is making marketing more personalized and human. ... This is where a marketer's touch and human intelligence come to play. At this point, and in the near future, there will still be a need for a human marketer behind AI tech to help steer the campaign in the right direction. If you're looking for evidence of this, consider the many issues that came to light as programmatic advertising gained traction.


Robot maps a room using just sound and AI

The researchers note that the shape of a room can be acoustically determined from corresponding room impulse responses (RIR), which can be extracted from recorded sound signals. Exploiting this fact, they considered time of arrivals (TOAs), or the time it takes for sound to travel from a source to a microphone. If the TOAs are known, they posited, the distance from the microphone to a target location can be inversely computed. But knowing TOAs isn’t enough, because the distances are unlabelled and acoustic sensors record reflections and echoes in an arbitrary order. To solve for this, the team tapped a four-microphone array (the fourth microphone was used to verify the distances) and used a reflective point, which in this case refers to the intersection between the line from a target spot in the room to a microphone and a potential wall line. If the reflective point and the real sound source were on different sides of a reconstructed or potential wall line, the proposed system treated the target spot as noise and discarded the data.
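
The TOA-to-distance relationship the researchers rely on is simply the speed of sound multiplied by the measured time. A small illustrative calculation (the numbers are made up, not from the paper):

```python
SPEED_OF_SOUND = 343.0  # metres per second in air at roughly 20 °C

def distance_from_toa(toa_seconds: float) -> float:
    # Time of arrival of an echo translates directly into the length of the
    # path the sound travelled (source -> wall -> microphone).
    return SPEED_OF_SOUND * toa_seconds

# An echo arriving 23.3 ms after emission corresponds to a ~8 m round trip,
# i.e. a wall roughly 4 m away for a co-located source and microphone.
round_trip = distance_from_toa(0.0233)
print(round_trip, round_trip / 2)
```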


Origami Inspired Robot Can Pack Your Groceries

The latest soft robot from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), however, breaks the mould. Appearing more like a flower than a piece of machinery, CSAIL’s Origami Robot Gripper is a collapsible skeleton that can suck up objects using a vacuum. Its rubber skin aids with grip, allowing the robot to pick up items from any angle regardless of their shape. Currently, ‘hard’ robots struggle with non standard shapes and, unlike the Origami inspired bot, can easily apply too much or too little force. Another benefit of the soft robot is that it is lightweight, and made with relatively inexpensive materials. This means it is cheaper and, thanks to a simple design, less complex to make. And, instead of requiring extensive programming to handle different shapes and sizes, its vacuum can pick up a variety of products from mushrooms to bottles of wine. It’s also capable of lifting 100 times its own weight. Scale up the design, and the Origami gripper could reliably retrieve a whole range of items. The most obvious application for MIT’s model is in groceries, either at physical checkouts or in warehouses.


More US Cities Battered by Ransomware

Despite advice from the FBI that organizations should not pay ransoms, the decision is increasingly being looked at from a cost/benefit perspective. Insurance policies may cover ransoms, and the option may look appealing if the cost of recovery is more than the ransom. And as ProPublica reported last month, some forensics firms that claim to be able to resolve a ransomware infection are actually paying the ransom while passing the cost onto their customers. Plus, there's the vexing question over who is profiting from the ransom. ProPublica traced four ransom payments made by Proven Data Recovery, a firm based in New York. The payments - made to get the decryption key for a SamSam infection - ended up in bitcoin wallets linked to Iran. The city of Baltimore, however, refused to pay a ransom after a recent attack and endured an estimated $18 million in recovery costs. The city was affected by the Robbinhood ransomware, which forced the city to revert to manual processes.



Quote for the day:


"Great achievers are driven, not so much by the pursuit of success, but by the fear of failure." -- Larry Ellison


Daily Tech Digest - July 02, 2019

TIN coalition calls for industry action against cyber fraud


The vision for overcoming social engineering challenges is to reduce the opportunities to establish false trust and to ensure that all remaining threats are well publicised and understood. The vision also requires organisations to interact with customers and staff in a way that reinforces security and to ensure that the security of interactions with individuals becomes less dependent on public information. To address operating in silos, the vision is to ensure that cyber fraud is understood across functions within and between organisations, to ensure that organisations are recognised for sharing useful information, not punished for suffering an attack, and to ensure that business and law enforcement collaborate effectively to tackle cyber fraud. And to reduce the gap between cyber security and anti-fraud operations, the vision is to ensure that the response to cyber attacks minimises the broader impact of data loss on society, that fraud teams in business and law enforcement are fully engaged in tackling cyber attacks as a precursor to fraud, that enforcement is globalised to tackle all forms of cyber fraud


Big Data Is Dead. Long Live Big Data AI.

“The value of the data analytics market can’t be ignored. The Looker and Tableau acquisitions demonstrate that even the biggest tech players are snapping up data analytics companies with big price tags, clearly demonstrating the value these companies have in the larger cloud ecosystem. And in terms of what this means for the evolution of AI, we’ve reached a point where we have more than enough anonymized data to train the system, and now it’s a matter of honing how we use the AI to extract the maximum value from data”—Amir Orad, CEO, Sisense “The Google Cloud/Looker and Salesforce/Tableau acquisitions are a direct reaction to the rate at which analytics workloads have been shifting to the cloud over the past few years. The state of AI is a reflection of this shift as machine learning, AI and analytics have become the primary growth opportunities for the cloud today. Yet, it's this same growth that is causing barriers to success, as AI projects overwhelmingly face the same problem -- data quality”—Adam Wilson, CEO, Trifacta


What can you do with the Microsoft Graph?


Working with the APIs can be tricky; it can be hard to construct the right query, especially if you're looking for more complex graph queries. Microsoft offers tools to help build and test queries, as well as SDKs that can simplify adding Graph support to your apps. One, the web-based Graph Explorer, allows you to try out queries without logging in to an Office 365 account. It provides sample queries that show how to extract specific information from the service, with a library of different queries to get started. You can only use GET queries against sample data; POST requires your account details and your data. Once you're ready to start working with live data, you can log in with a Microsoft account, and start using your Microsoft 365 tenant. The list of query categories is long, covering working with users, with mail and calendar, as well as files and apps. The Graph Explorer doesn't only show production queries, it supports beta APIs, so you can experiment before adding them in your code. Queries can be cut-and-pasted from the Explorer, and you can see any request headers or bodies that need to be constructed and delivered with the REST HTTP query.
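
Outside the Graph Explorer, the same kind of GET query can be issued from code. The sketch below uses Python's requests library against the real /v1.0/me endpoint and assumes an OAuth access token has already been obtained through the Microsoft identity platform; token acquisition itself is out of scope here.

```python
import requests

# Minimal Microsoft Graph GET query. ACCESS_TOKEN is assumed to come from a
# separate OAuth flow registered in Azure AD.
ACCESS_TOKEN = "<token obtained via the Microsoft identity platform>"

resp = requests.get(
    "https://graph.microsoft.com/v1.0/me",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
resp.raise_for_status()
print(resp.json()["displayName"])  # basic profile field returned by /me
```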



Offensive Security launches OffSec Flex, a new cybersecurity training program

Organizations can now use OffSec Flex to purchase blocks of Offensive Security’s industry-leading practical, hands-on training, certification and virtual lab offerings, allowing them to proactively increase and enhance the level of cybersecurity talent available within their organizations. With Offensive Security’s hands-on courses, labs and exams readily available, organizations are able to offer educational opportunities to new hires and non-security team members alike, improving their security posture and equipping their employees with the adversarial mindset necessary to protect modern enterprises from today’s threats. “Cybersecurity training is not just for security professionals anymore,” said Kerry Ancheta, VP of Worldwide Sales, Offensive Security. “Increasingly we see organizations recommend pentest training courses for their software development or application security teams in order to improve their understanding for how their systems and applications are attacked.


Calculating The Cost of Software Quality in Your Organization


Basically, the costs of software quality (COSQ) are those costs incurred through both meeting and not meeting the customer’s quality expectations. In other words, there are costs associated with defects, but producing a defect-free product or service has a cost as well. Calculating these costs serves the purpose of identifying just how much the organization spends to meet the customer’s expectations, and how much it spends (or loses) when it does not.  Knowing these values allows management and team members across the company to take action in ensuring high quality at a lower cost. While analyzing the COSQ at an organization may lead to the revelation of uncomfortable truths about the state of quality management at the company, the process is important for eliminating waste associated with poor quality. This often requires a mindset and culture shift from viewing software quality defects as individual failures to seeing them as opportunities to improve as a collective team.
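
One common way to operationalise COSQ - not prescribed by the article, but the classic cost-of-quality breakdown - is to split it into conformance costs (prevention and appraisal) and non-conformance costs (internal and external failure). A small worked example with made-up figures:

```python
# Classic cost-of-quality breakdown; all figures are invented for illustration.
costs = {
    "prevention": 40_000,        # training, code review, static analysis
    "appraisal": 25_000,         # testing, audits
    "internal_failure": 30_000,  # rework on defects found before release
    "external_failure": 55_000,  # production incidents, support, churn
}

conformance = costs["prevention"] + costs["appraisal"]
non_conformance = costs["internal_failure"] + costs["external_failure"]
cosq = conformance + non_conformance
print(f"COSQ = {cosq:,} ({non_conformance / cosq:.0%} spent on failure)")
```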


Machine learning has been used to automatically translate long-lost languages


It’s not hard to imagine that recent advances in machine translation might help. In just a few years, the study of linguistics has been revolutionized by the availability of huge annotated databases, and techniques for getting machines to learn from them. Consequently, machine translation from one language to another has become routine. And although it isn’t perfect, these methods have provided an entirely new way to think about language. Enter Jiaming Luo and Regina Barzilay from MIT and Yuan Cao from Google’s AI lab in Mountain View, California. This team has developed a machine-learning system capable of deciphering lost languages, and they’ve demonstrated it by having it decipher Linear B—the first time this has been done automatically. The approach they used was very different from the standard machine translation techniques. First some background. The big idea behind machine translation is the understanding that words are related to each other in similar ways, regardless of the language involved.
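
The "words relate to each other in similar ways" idea is usually captured with word vectors. The toy sketch below (vectors invented purely for demonstration) shows how cosine similarity scores related words higher than unrelated ones, which is the kind of structure such decipherment methods exploit across languages.

```python
import numpy as np

# Made-up embeddings: each word is a point in a vector space.
vectors = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "queen": np.array([0.9, 0.2, 0.8]),
    "apple": np.array([0.1, 0.1, 0.9]),
}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine(vectors["king"], vectors["queen"]))  # related words score higher...
print(cosine(vectors["king"], vectors["apple"]))  # ...than unrelated ones
```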


The Agile Manifesto: A Software Architect's Perspective

Specifications with an architectural impact (in the form of new user stories) should be tracked by the architect and assessed in a pragmatic approach by the whole development team, including experienced developers, test engineers, and devops. Bad habits from the past, when the architect created on paper the full-blown technical design for the team, do not fit within modern agile environments. There are multiple flaws with this model, which I also faced in my day-to-day work. First and most important, the architect might be wrong. This happened to me after I created a detailed upfront technical design and presented it to the development team during Sprint refinements. I got questions related to cases I did not think about or failed to take into account. In most of the cases, it turned out the initial design was either incomplete or impractical, and required extra work. Big upfront design limits the creativity and autonomy of the team members, since they must follow a recipe which is already given. From a psychological standpoint, even the author might become biased and more reluctant to change it afterwards, trying to prove it is correct rather than admit its flaws.


Essential tips for scaling quality AI data labeling


Data scientists are using labeled data and natural language processing (NLP) to automate legal contract review and predict patients who are at higher risk of chronic illness. The success of these systems depends on skilled humans in the loop, who label and structure the data for machine learning (ML). High-quality data yields better model performance. When data labeling is low quality, an ML model will struggle to learn. According to a report by analyst firm Cognilytica, about 80 percent of AI project time is spent on aggregating, cleaning, labeling, and augmenting data to be used in ML models. Just 20 percent of AI project time is spent on algorithm development, model training and tuning, and ML operationalization. These tasks are at the heart of AI development and require strategic thinking, along with a more advanced set of engineering or computer science skills. It’s best to deploy more expensive human resources — such as data scientists and ML engineers — on tasks that require expertise, collaboration, and analytical skills.


Effective or Not? The Real Impact of GDPR


The General Data Protection Regulation wasn’t just meant to give governments the means to enforce data security rules. Another key objective was to change how both companies and users behave when it comes to ensuring personal data remains private and protected. In this sense, GDPR seems to have had the desired impact. ... Another interesting fact the data shows is that users may have moved some of their own responsibility to GDPR enforcers. Two indicators led to this observation: respondents are less likely to read privacy statements than they were in 2015 (down 7 percentage points), and 17% say it is enough for them to see that the website has a privacy policy, so they choose not to read the document at all. A similar behavior pattern emerges when dealing with social media usage. Fewer users – 56% in 2019 vs 60% in 2015 – actually change the privacy settings for their personal profile. The three most common reasons social network users give for not trying to change their personal profile’s default settings are that they trust the sites to set appropriate privacy settings (29%), that they do not know how to (27%), or that they are not worried about sharing their personal data (20%).


5 steps for digital workplace transformation


Start by recognizing actionable opportunities within your business operations. Approach the prospects for digital transformation from a business instead of technology perspective. Line-of-business (LOB) teams should lead this effort, coordinating closely with senior IT staffers to identify critical barriers to success. Of course, each organization faces its own set of challenges. But, at the onset, step back and identify key themes -- accelerating innovation, enhancing productivity, improving governance or reshaping the steps in the customer journey -- that make good business sense. Consider operations as a whole, while focusing on people and processes, and determine your target audiences: employees, partners and/or customers. Then, engage a cross section of these audiences in conversations about what they are doing and how they understand the underlying business purposes. Develop both the technology and the business insights about what is happening from the participants' perspectives. Listen carefully as they describe their tasks, and be sure to observe how they do their work to determine where bottlenecks occur.



Quote for the day:


“The real voyage of discovery consists not in seeking new landscapes but in having new eyes.” -- Marcel Proust


Daily Tech Digest - July 01, 2019

Automation Is Becoming A C-Suite Priority

Automation is becoming a C-Suite priority - CIO&Leader
While automation maturity is at its highest in the US, with over 60% of organizations making extensive use of automation, there are some interesting findings from India. The country shows the highest level of enthusiasm about automation among CIOs and other senior executives. 84% believe RPA is a high or essential priority for meeting strategic business objectives in Indian businesses, against the global average of 76%. Also, 90% of C-level executives expect their company’s financial results, namely profitability, operating costs and revenue growth, to improve as a result of automation. Sector-wise, IT and manufacturing have outpaced other industries in automating business processes. By contrast, government and public sector institutions have made the least headway among surveyed sectors. Of CIOs who have implemented automation, most have automated highly repetitive back-office functions. “Automation of functions is most extensive in IT, operations and production, customer service and finance.”



FTC data privacy enforcement will threaten corporate bottom lines


Despite mounting concerns over data security and privacy practices that put consumers’ data at risk, the U.S. Congress has yet to adopt national legislation to address cybersecurity, and security spending will see only a nominal increase under the current administration’s recent budget proposal. Consequently, organizations are subject to a patchwork of laws and regulations relevant to cybersecurity and privacy practices, including differing laws and regulations in each state and the District of Columbia, as well as rules from multiple federal administrative agencies. Against this backdrop, the FTC has taken on a comprehensive directive, extending its supervision over all companies operating in the United States. In fact, the FTC has assumed a leading role in policing corporate cybersecurity practices since 2002. Since that time, it has brought more than 200 cases against companies for unfair or deceptive practices that endangered the personal data of consumers.



Huge jump in cyber incidents reported by finance sector


Overall, Snaith said there remain serious vulnerabilities across some financial services businesses when it comes to the effectiveness of their cyber controls. “More needs to be done to embed a cyber resilient culture and ensure effective incident reporting processes are in place,” he said. UK law enforcement is also calling for improvements in cyber crime reporting. “It is crucial that businesses report cyber crime to us because every incident is an investigative opportunity,” Rob Jones, director of threat leadership at the UK National Crime Agency (NCA), told Computer Weekly. “Failure to report creates an unpoliced space and a situation where incident response companies just sweep up the glass, but don’t deal with the underlying issue, which emboldens criminals,” he said. “As a result, the problem will continue and prevalence, severity and sophistication of attacks will increase.” Nigel Hawthorn, data privacy expert at security firm McAfee, said that it is widely recognised that cyber incidents were previously under-reported.


How does the CVE scoring system work?

The first thing to understand is that there are three types of metrics used in this system: Base Score Metrics, which depend on sub-formulas for the Impact Sub-Score (ISS), Impact, and Exploitability; Temporal Score Metrics, which are equal to a round-up of BaseScore * ExploitCodeMaturity * RemediationLevel * ReportConfidence; and Environmental Score Metrics, which depend on sub-formulas for the Modified Impact Sub-Score (MISS), ModifiedImpact, and ModifiedExploitability. The formula for the Modified Impact Sub-Score is Minimum(1 - [(1 - ConfidentialityRequirement * ModifiedConfidentiality) * (1 - IntegrityRequirement * ModifiedIntegrity) * (1 - AvailabilityRequirement * ModifiedAvailability)], 0.915). Within each set of metrics are the following sub-categories: Base Score Metrics: Attack Vector, Attack Complexity, Privileges Required, User Interaction, Scope, Confidentiality Impact, Integrity Impact, Availability Impact; Temporal Score Metrics: Exploitability, Remediation Level, Report Confidence; and Environmental Score Metrics: Attack Vector, Attack Complexity, Privileges Required, User Interaction, Scope, Confidentiality Impact, Integrity Impact, Availability Impact, ...
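
To make the arithmetic concrete, here is a minimal TypeScript sketch of the two formulas quoted above (they mirror the CVSS v3.x specification). The metric values passed to the example calls are illustrative assumptions, not figures from the article; real scores come from the value tables published in the CVSS specification.

```typescript
// CVSS "round up": keep one decimal place, always rounding up (e.g. 6.984 -> 7.0).
function roundUp1(value: number): number {
  return Math.ceil(value * 10) / 10;
}

// Temporal score = round-up of BaseScore * ExploitCodeMaturity * RemediationLevel * ReportConfidence.
function temporalScore(
  baseScore: number,
  exploitCodeMaturity: number,
  remediationLevel: number,
  reportConfidence: number
): number {
  return roundUp1(baseScore * exploitCodeMaturity * remediationLevel * reportConfidence);
}

// Modified Impact Sub-Score (MISS), capped at 0.915 as stated above.
function modifiedImpactSubScore(
  confidentialityRequirement: number, modifiedConfidentiality: number,
  integrityRequirement: number, modifiedIntegrity: number,
  availabilityRequirement: number, modifiedAvailability: number
): number {
  const product =
    (1 - confidentialityRequirement * modifiedConfidentiality) *
    (1 - integrityRequirement * modifiedIntegrity) *
    (1 - availabilityRequirement * modifiedAvailability);
  return Math.min(1 - product, 0.915);
}

// Example calls with assumed metric values.
console.log(temporalScore(7.5, 0.97, 0.96, 1.0));                     // 7.0
console.log(modifiedImpactSubScore(1.0, 0.22, 1.0, 0.22, 0.5, 0.22)); // ≈ 0.459
```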


AI is changing the entire nature of compute


"Hardware capabilities and software tools both motivate and limit the type of ideas that AI researchers will imagine and will allow themselves to pursue," said LeCun. "The tools at our disposal fashion our thoughts more than we care to admit." It's not hard to see how that's already been the case. The rise of deep learning, starting in 2006, came about not only because of tons of data, and new techniques in machine learning, such as "dropout," but also because of greater and greater compute power. In particular, the increasing use of graphics processing units, or "GPUs," from Nvidia, led to greater parallelization of compute. That made possible training of vastly larger networks than in past. The premise offered in the 1980s of "parallel distributed processing," where nodes of an artificial network are trained simultaneously, finally became a reality.  Machine learning is now poised to take over the majority of the world's computing activity, some believe. During that ISSCC in February, LeCun spoke to ZDNet about the shifting landscape of computing. 


SOLID Principles: Interface Segregation Principle (ISP)

A great, simple definition of the Interface Segregation Principle is given in the book you have already heard of, “Agile Principles, Patterns, and Practices in C#”: “The Interface Segregation Principle states that Clients should not be forced to depend on methods they do not use.” ... Here is an interesting historical note about the ISP. I’m pretty sure the idea behind ISP was used long before Robert Martin, but the first public formulation belongs to Robert C. Martin. He applied the ISP for the first time while consulting for Xerox. Xerox had created a new printer system that could perform a variety of tasks such as stapling and faxing. The software for this system was created from the ground up. As the software grew, making modifications became more and more difficult, so that even the smallest change would take a redeployment cycle of an hour, which made development nearly impossible. The redeployment cycle took so much time because, at the time, there was no C# or Java, languages that compile very quickly. The same cannot be said of C++: a badly designed C++ program can lead to significant compilation times.
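
Here is a minimal sketch of the principle in TypeScript (the book’s own examples are in C#; the interface and class names below are illustrative, not taken from the original text). Splitting one “fat” multi-function interface into small role interfaces means a client that only prints no longer depends on stapling or faxing methods.

```typescript
// "Fat" interface: every client is forced to depend on methods it may not use.
interface MultiFunctionDevice {
  print(doc: string): void;
  staple(doc: string): void;
  fax(doc: string): void;
}

// Segregated role interfaces: clients depend only on what they actually call.
interface Printer { print(doc: string): void; }
interface Stapler { staple(doc: string): void; }
interface Fax     { fax(doc: string): void; }

// A simple device implements only the interfaces it supports.
class BasicPrinter implements Printer {
  print(doc: string): void { console.log(`printing: ${doc}`); }
}

// A high-end device can still implement several role interfaces.
class OfficeStation implements Printer, Stapler, Fax {
  print(doc: string): void  { console.log(`printing: ${doc}`); }
  staple(doc: string): void { console.log(`stapling: ${doc}`); }
  fax(doc: string): void    { console.log(`faxing: ${doc}`); }
}

// A report module only needs Printer, so changes to Fax never touch it
// and never force it through a redeployment cycle.
function printReport(device: Printer, report: string): void {
  device.print(report);
}

printReport(new BasicPrinter(), "Q2 summary");
```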


Mist Wi-Fi no longer just cloud


The Mist Edge hardware appliance avoids having access points (APs) in each office on a campus communicate directly with Mist's WxLAN technology in the cloud. Instead, WxLAN policies created through Mist's cloud-based dashboard are stored in the on-premises appliance. WxLAN policies assign resources, such as servers and printers, to groups of users. Network managers can also create a service set identifier for a select group of users and assign services or devices that only they can access. Mist Edge is available only as a stand-alone appliance. Mist plans to ship the appliance's software on a virtual machine this year. Mist Edge reflects the preference of some enterprises to split management technology between the cloud and on premises. Companies more comfortable with an on-site WLAN controller, for example, could switch to Mist Edge, said Brandon Butler, an analyst at IDC. "Overall, we see more and more enterprises gaining comfort with managing their WLAN environments from the cloud but giving customers a choice in how to manage their environments is always good," he said.
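
As a rough illustration of the kind of policy data described above, here is a hypothetical TypeScript sketch of WxLAN-style policies held on an on-premises appliance. The types, field names, and addresses are assumptions made up for illustration; they are not Mist’s actual API or configuration schema.

```typescript
// Hypothetical resource reference: a server or printer a user group may reach.
interface ResourceRef {
  kind: "server" | "printer";
  address: string; // IP address or hostname
}

// Hypothetical policy: maps a user group to its resources and, optionally, a dedicated SSID.
interface WxlanPolicy {
  userGroup: string;
  allowedResources: ResourceRef[];
  ssid?: string;
}

// Example policies stored locally on the appliance.
const policies: WxlanPolicy[] = [
  {
    userGroup: "finance",
    allowedResources: [
      { kind: "server",  address: "10.0.5.10" },
      { kind: "printer", address: "10.0.5.20" },
    ],
    ssid: "corp-finance",
  },
  {
    userGroup: "guests",
    allowedResources: [{ kind: "printer", address: "10.0.9.5" }],
  },
];

// An on-premises appliance can answer "may this group reach this resource?"
// from the locally stored policies, without a round trip to the cloud.
function isAllowed(group: string, address: string): boolean {
  const policy = policies.find(p => p.userGroup === group);
  return !!policy && policy.allowedResources.some(r => r.address === address);
}

console.log(isAllowed("guests", "10.0.5.10"));  // false
console.log(isAllowed("finance", "10.0.5.10")); // true
```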


Beyond Limits: Rethinking the next generation of AI

A human profile containing digital wireframe of technology connections.
Beyond Limits evolved out of work with NASA's Jet Propulsion Laboratory (JPL) for remote rovers used to explore places like the moon and Mars. Due to the communications lag in space, real-time control is virtually impossible. Any AI solution must be not only fully autonomous, it must be able to train and, ideally, correct itself. When there is a problem it can’t correct, the bandwidth limitations for communication make full reprogramming problematic…but point patches are certainly possible. This resulted in an AI platform uniquely able to be updated, modified and, to a certain and initially limited extent, able to both teach itself and make corrections while disconnected. This unusual requirement likely has made the resulting AI nearly ideal for areas where the AI must often act independent of oversight – and/or in areas where problems can escalate very rapidly – and the AI must be able to both deal with a diversity of known and unknown issues. ... Although still in its infancy, Beyond Limits represents a new class of AI: it is better able to operate fully autonomously, it can learn on the fly, and it can increasingly make corrections to its own programming.


HPE promises 100% reliability with its new storage system

HPE promises 100% reliability with its new storage system
Primera was announced last week at HPE’s Discover event in Las Vegas. Phil Davis, chief sales officer for HPE, said in the announcement keynote, “If you think about traditional storage, it’s full of compromises and complexity. Do I want fast or reliable? Do I want agility or simplicity? But not any more. We’re going to combine the simplicity of Nimble with the intelligence of InfoSight and the mission-critical heritage of 3PAR, and we’ve created a new class of storage that eliminates the traditional compromises and truly redefines what is possible with storage.” Davis said Primera will run out of the box with just a few cable connections and can be auto-provisioning storage within 20 minutes. That means no need for IT consultants to install and configure the hardware. The more workloads you add to a storage system, the more unpredictable latency becomes. Using InfoSight’s parallelism, Primera improved the throughput and latency of an Oracle database by 122% over the prior storage system, which HPE did not identify.


Using AI-powered intelligent automation for digital transformation success

A maturity model assessment begins with evaluating automation readiness from a technology and process perspective. IT should be involved in the discussion early on because they understand how automation technologies will fit within the larger IT framework. They’re also responsible for managing the environments that these technologies operate in and for ensuring proper security protocols are followed throughout the deployment process. From a business process and operations standpoint, organizations should assess how well-documented current processes are during this stage. If there’s room to improve prior to automation, this presents an opportunity to make upfront investments in this respect. Automation is most powerful when deployed against processes that are already running properly; it isn’t intended to ‘fix’ or alleviate the pain points around broken processes. In other words, optimize first and then automate for the best results.



Quote for the day:


"One of the sad truths about leadership is that, the higher up the ladder you travel, the less you know." -- Margaret Heffernan