Daily Tech Digest - January 16, 2018

Data Management 2018
A lot is happening in the market today, as data continues to explode at unprecedented levels. Compliance is no longer an option, but a requirement. The push to cut costs and embrace the multi-cloud – yet still maintain visibility of your data – has never been more critical. At the same time, the need to safeguard against data breaches is an absolute must. And the ability to gather as many insights as possible from your data could be the difference between success and failure for many organisations. So, what does all of this mean for today's CIOs and IT decision makers? Here are the top five predictions for the coming year. For those who see challenges as opportunities, this could be an exciting year. ... New data valuation techniques are expected to get a boost from AI, reshaping information lifecycle management through automated policy enforcement and more intelligent data management actions.
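To make that last prediction concrete, here is a minimal, purely hypothetical sketch of automated lifecycle policy enforcement in Python. The rule names, record fields, and thresholds are invented for illustration; an AI-driven system would replace the hard-coded rules with a learned data-valuation model.

```python
from datetime import datetime, timedelta

def lifecycle_action(record: dict) -> str:
    """Choose a data-management action from a record's age and sensitivity."""
    age = datetime.utcnow() - record["last_accessed"]
    if record["sensitivity"] == "regulated":
        return "retain-and-encrypt"        # compliance is a requirement, not an option
    if age > timedelta(days=365):
        return "archive-to-cold-storage"   # cut storage costs on stale data
    return "keep-hot"

record = {"last_accessed": datetime.utcnow() - timedelta(days=500),
          "sensitivity": "internal"}
print(lifecycle_action(record))            # -> archive-to-cold-storage
```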


What is Zero Trust? A model for more effective security

The Zero Trust model of information security kicks to the curb the old castle-and-moat mentality that had organizations focused on defending their perimeters while assuming everything already inside didn't pose a threat and was therefore cleared for access. Security and technology experts say the castle-and-moat approach isn't working. They point to the fact that some of the most egregious data breaches happened because hackers, once they gained access inside corporate firewalls, were able to move through internal systems without much resistance. "One of the inherent problems we have in IT is we let too many things run way too openly with too many default connections. We essentially trust way too much," Cunningham says. "That's why the internet took off – because everyone could share everything all the time. But it's also a key fail point: If you trust everything, then you don't have a chance of changing anything security-wise."
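As an illustration of the model (not any vendor's implementation), the sketch below contrasts the two mindsets: a castle-and-moat check would wave through any request from an internal IP, while Zero Trust authenticates and authorizes every request individually. The token store and policy table are hypothetical stand-ins for an identity provider and a least-privilege policy engine.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Request:
    user_token: Optional[str]
    source_ip: str
    resource: str

VALID_TOKENS = {"token-abc": "alice"}   # stand-in for an identity-provider lookup
POLICY = {"alice": {"payroll-db"}}      # least-privilege grants per user

def authorize(req: Request) -> bool:
    # Castle-and-moat would short-circuit here with something like
    # `if is_internal(req.source_ip): return True`. Zero Trust never does:
    # network location confers no trust.
    user = VALID_TOKENS.get(req.user_token)
    if user is None:
        return False                    # verify explicitly, on every request
    return req.resource in POLICY.get(user, set())

print(authorize(Request("token-abc", "10.0.0.5", "payroll-db")))  # True
print(authorize(Request(None, "10.0.0.5", "payroll-db")))         # False, despite the internal IP
```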


Microsoft Stresses Security, Responsible AI in Cloud Policy Updates


In the 2018 update, Microsoft is tackling some of the negative consequences of using AI and other technologies based on the tumultuous year the IT industry experienced in 2017. "We continue to witness cyber-attacks by nation-states on citizens, critical infrastructure and the institutions of democracy. We read on an almost daily basis about the criminal hacking of companies and governments to steal private and sensitive information of customers," wrote Smith in the 2018 update to the e-book. "We listen to the concerns about the loss of jobs to automation and the disruptive impact of artificial intelligence (AI) on entire sectors of the economy." In terms of cyber-security, Microsoft continues to honor its commitment to spend $1 billion in the IT security field each year. If necessary, the company is poised to use legal means to disrupt nation-state attacks.


Cloud computing: Three strategies for making the most of on-demand

"It's still complicated," says Marks, before suggesting more organisations will continue to move from an exploratory stage through to full adoption. "The difference today is that the cloud is understood for its different dimensions, be that at the level of the infrastructure, the platform or services. The migration of applications and data from the server downstairs to the public cloud is a shift that continues. The key point is that it's hard to think of a sensible reason why a CIO would buy hardware ever again -- and that should be your starting point." ZDNet speaks with three IT leaders at different stages of the cloud adoption process: exploring, transforming, and pioneering. Evidence from these three stages suggests cloud-led change remains a work in progress, where smart IT leaders assess their business context and provide an on-demand solution that can flex with future requirements.


Enterprise software spending set to grow thanks to AI and digital boost


“Looking at some of the key areas driving spending over the next few years, Gartner forecasts $2.9tn in new business value opportunities attributable to AI by 2021, as well as the ability to recover 6.2 billion hours of worker productivity,” he said. “Capturing the potential business value will require spending, especially when seeking the more near-term cost savings. “Spending on AI for customer experience and revenue generation is likely to benefit from AI being a force multiplier – the cost to implement will be exceeded by the positive network effects and resulting increase in revenue.” Gartner forecast a slight increase of 0.6% in datacentre spending in 2018 compared with 2017, but predicted a decline of 0.2% in 2019. As Computer Weekly has reported previously, this may be related to the increase in SaaS and cloud-based services.


Lessons in Becoming an Effective Data Scientist

The first skill that I look for when engaging with or hiring a data scientist is humility. I look for the ability to listen and engage with others who may not seem as smart as them. And as you can see from our DEPP methodology, humility is the key to driving collaboration between the business stakeholders (who will never understand data science to the level that a data scientist does) and the data scientist. Humility is critical to our DEPP methodology because you can't learn what's important for the business if you aren't willing to acknowledge that you might not know everything. Humility is one of the secrets to effective collaboration. Nowhere does the business/data science collaboration play a more important role than in hypothesis development. If you get the hypothesis and the metrics against which you are going to measure success wrong, everything the data scientist does to support that hypothesis doesn't matter.
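For a concrete (and entirely hypothetical) example of agreeing on the hypothesis and metric up front, the sketch below frames a business hypothesis – "the new onboarding flow raises 7-day retention" – as a two-proportion z-test. It assumes the statsmodels library, and the numbers are made up for illustration.

```python
from statsmodels.stats.proportion import proportions_ztest

retained = [430, 470]    # users still active at day 7 (control, treatment)
exposed = [1000, 1000]   # users assigned to each group

stat, p_value = proportions_ztest(retained, exposed)
print(f"z = {stat:.2f}, p = {p_value:.4f}")

# The threshold that turns this number into a decision (e.g. p < 0.05 AND a
# lift large enough to justify the build cost) is exactly what the business
# stakeholders and the data scientist must agree on before any analysis runs.
```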


7 Acquisitions that Point to Cloud Maturity


Over the better part of a decade, cloud computing mergers and acquisitions painted a picture of cloud service providers on the hunt to accumulate as many customers as possible. Thus, you saw massive build-outs of facilities and a haphazard set of mergers and acquisitions that had no real rhyme or reason. We also witnessed incredible price wars when it came to commodity cloud resources in the IaaS and PaaS space. In a nutshell, the cloud service provider market has always been about growth by any means necessary. If you contrast previous years' cloud acquisitions with those that occurred in the latter part of 2017 and into 2018, we start to see a new pattern forming. Sure, there are still signs of significant growth in the cloud space for those looking for bleeding-edge services. Yet at the same time, you see a trend towards stability, with cloud acquisition dollars following what the customer wants in a cloud service -- as opposed to the other way around.


As the cloud’s popularity grows, so does the risk to sensitive data

Despite the prevalence of cloud usage, the study found that there is a gap in awareness within businesses about the services being used. Only a quarter (25%) of IT and IT security practitioners revealed they are very confident they know all the cloud services their business is using, while a third (31%) say they are confident they know. Looking more closely, shadow IT may be continuing to cause challenges. Over half of Australian (61%), Brazilian (59%) and British (56%) organizations are not confident they know all the cloud computing apps, platforms or infrastructure services their organization is using. Confidence is higher elsewhere, with only around a quarter in Germany (27%), Japan (27%) and France (25%) not confident. Fortunately, the vast majority (81%) believe that having the ability to use strong authentication methods to access data and applications in the cloud is essential or very important.
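As one example of what "strong authentication" can mean in practice, the sketch below layers a time-based one-time password (TOTP) check on top of a password login. It assumes the third-party pyotp library and is illustrative rather than prescriptive.

```python
import pyotp  # pip install pyotp

secret = pyotp.random_base32()       # provisioned once per user, stored server-side
totp = pyotp.TOTP(secret)

print("Code in the user's authenticator app right now:", totp.now())

def second_factor_ok(submitted_code: str) -> bool:
    # valid_window=1 tolerates one 30-second step of clock drift
    return totp.verify(submitted_code, valid_window=1)

print(second_factor_ok(totp.now()))  # True
print(second_factor_ok("000000"))    # almost certainly False
```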


Big Data 2018: 4 Reasons To Be Excited, 4 Reasons To Be Worried

Figure 1. TensorFlow Playground offers an interactive sandbox for exploring the foundations of TensorFlow. (Source: Google)
Machine-learning models can accurately perform recognition of specific patterns in data streams. In environments already inundated with data, this capability provides high value and distinct advantages, and the industry has responded accordingly. Data scientists can take advantage of a growing number of open-source machine-learning frameworks including Google's TensorFlow, Apache MXNet, Facebook's Caffe2, and Microsoft's Cognitive Toolkit, among others. Most important, the task of building models has never been easier. For example, Amazon Web Services (AWS) offers deep learning AMIs (Amazon Machine Images) with the leading ML frameworks already built in and ready for use on the AWS cloud. For those just starting, Google's TensorFlow Playground helps users learn more about the neural networks underlying machine learning frameworks, using simple data sets and pre-trained models.
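To illustrate how low the barrier to model-building has become, here is a minimal TensorFlow example using its Keras API: a small feed-forward classifier trained on synthetic data standing in for a real pattern-recognition task.

```python
import numpy as np
import tensorflow as tf

# Synthetic binary-classification data: the label depends on two of 20 features.
x = np.random.randn(1000, 20).astype("float32")
y = (x[:, 0] + x[:, 1] > 0).astype("float32")

model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(20,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x, y, epochs=5, batch_size=32, verbose=0)

loss, accuracy = model.evaluate(x, y, verbose=0)
print(f"training accuracy: {accuracy:.2f}")
```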


Container infrastructure a silver lining amid Intel CPU flaw fixes

Meltdown and Spectre loom over containers
"Most folks running containers have something like [Apache] Mesos or Kubernetes, and that makes it easy to do rolling upgrades on the infrastructure underneath," said Andy Domeier, director of technology operations at SPS Commerce, a communications network for supply chain and logistics businesses based in Minneapolis. SPS uses Mesos for container orchestration, but it is evaluating Kubernetes, as well. Containers are often used with immutable infrastructures, which can be stood up and torn down at will and present an ideal means to handle the infrastructure changes on the way, due to these specific Intel CPU flaws or unforeseen future events. "It really hammers home the case for immutability," said Carmen DeArdo, technology director responsible for the software delivery pipeline at Nationwide Mutual Insurance Co. in Columbus, Ohio.



Quote for the day:


"When we lead from the heart, we don't need to work on being authentic we just are!" -- Gordon Tredgold

