Are Programming Languages Key to the Evolution of Machine Learning?
“We are at the stage where the data, compute and deep learning algorithms that are absolutely necessary to make AI a reality have all become abundant,” Tutuk said. “But just like the early days of computer technology, the use of state-of-the-art AI is locked out of the reach of millions of developers. At present, even the most popular deep learning frameworks all require a great level of expertise.” There was a similar trend in the early days of computer software, she said: programming languages like Fortran, C, and C++ were easier to use than assembly languages but were still largely inaccessible to most. It was the development of high-level programming languages like Java, Python, and PHP that made computer programming much more widely accessible around the world. “Without these high-level abstractions, the digital world as we know it today would not exist,” Tutuk said.
These are the top skill sets for a successful blockchain team
Those entering the blockchain development/engineering field should have the mentality of a hacker - the ability to problem-solve collaboratively in a workshop setting when a client presents a business problem. They need to be able to think through the business objectives, implications and value "for each of the participants and then [define] the architecture and overall solution flow," KPMG said. "It is this collaborative approach that leads to a successful application of blockchain." Given the lack of coursework around blockchain and its relative newness in the enterprise, a team must be open to exploring and experimenting by "hacking the problem" from a business and IT perspective, according to KPMG. "I'd say at KPMG we've been very successful at taking [employee] skills in-house and upskilling them to deliver blockchain skills," Keele said. "Until universities start printing blockchain degrees, that will be the pattern that will continue."
Security and privacy key to smart buildings and cities
One of the biggest challenges is the huge number and variety of stakeholders who all have a role to play and need to work in collaboration. These include building owners, property developers, landlords, building occupants, architects, technology suppliers, building services engineers, town planners, chief security officers, chief information security officers, data protection officers and more. At the core of the security problem is the fact that many of the systems smart buildings and cities will rely on will be linked to a wide variety of internet of things (IoT) connected devices and sensors that are potentially vulnerable to cyber attacks. The whitepaper underlines the importance of considering and evaluating cyber security throughout the whole supply chain to protect data, maintain privacy and keep the risk associated with cyber threats to a minimum. According to the whitepaper, this process should always start by looking at device security and the supplier’s cyber maturity.
How Developers Can Learn the Language of Business Stakeholders
Business stakeholders are not your enemies; once they have enough sound information on where we are and what we expect to happen next, they are willing to accept reasonable requests or decisions - even the additional learning time that team members may require. What you can do is approach opportunities for learning using the learning curve effect. Although this is something we should not ignore, I often see it skipped over when planning a project or forecasting work. We tend to carry metrics and statistics from the stable state over to the initial phases, when the team is still forming. The same applies to a new person in a role, who needs to learn not only her place in the project but often new responsibilities as well. Let’s keep this in mind when planning work or identifying impediments. You can read more on the learning curve effect in a separate article I wrote on the topic, Never stop learning – why is the learning curve effect so powerful?
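As a quick illustration of why stable-state metrics mislead during a team's forming phase, here is a minimal sketch using Wright's classic learning curve model, where each doubling of repetitions cuts unit time to a fixed fraction; the 80% rate and the 10-hour first task are invented numbers for illustration, not figures from the article.

```python
import math

def time_for_unit(t_first: float, n: int, learning_rate: float) -> float:
    """Wright's learning curve: time for the n-th repetition of a task.

    t_first       -- time the first repetition took
    n             -- repetition number (1, 2, 3, ...)
    learning_rate -- e.g. 0.8 means each doubling of repetitions
                     cuts unit time to 80% of what it was
    """
    b = math.log2(learning_rate)  # negative exponent, so time decreases
    return t_first * n ** b

# Illustrative numbers: a task that takes 10 hours the first time,
# with an assumed 80% learning rate.
for n in (1, 2, 4, 8):
    print(f"repetition {n}: {time_for_unit(10.0, n, 0.8):.1f} hours")
# repetition 1: 10.0 hours
# repetition 2: 8.0 hours
# repetition 4: 6.4 hours
# repetition 8: 5.1 hours
```

Forecasting early-phase work from the repetition-8 figure, as stable-state metrics implicitly do, would understate the first tasks by nearly half.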
Enterprise architect role is more about business than ever
"In the past that's been somewhat separated," Nelson said. "There might have been dedicated business architects running around that live on the business side that may or may not interact with EA." He said that in a similar vein, CIOs are now frequently called to the overall business strategy table, and that trend is dragging all of IT -- particularly enterprise architects -- in the same direction. But these organizations need more than just a general IT liaison. Now that businesses put so much value on their digital strategy, they need constant input from architects that possess an intimate understanding of their software capabilities and can shape development practices to meet specific business needs. Aslinn Merriman, emerging technology architect at Sargento, Inc., a large food production company based in Plymouth, Wisconsin, agreed that the architect's purpose is to help set a strategy and facilitate development goals that align with other business units and the overall organization.
US Cyber Command Warns of Outlook Vulnerability Exploits
While the warning from Cyber Command did not offer many details, some security researchers, including analysts with Chronicle - the cybersecurity arm of Alphabet - suspect that this latest attack is related to the activity of an advanced persistent threat group known as APT33, which also goes by the name Shamoon. In research published in 2017, FireEye analysts found that APT33 has possible ties to Iranian intelligence and has previously targeted aerospace and energy firms in the Middle East. Over the last two weeks, the U.S. Department of Homeland Security's Cybersecurity and Infrastructure Security Agency has warned about an increase in Iranian espionage and cyber activity, including the increasing use of so-called "wiper" attacks that render computers unusable. One of the largest wiper attacks ever recorded targeted the oil giant Saudi Aramco in 2012. In that case, the attackers used malware also called Shamoon, which has appeared in other attacks over the course of the last several years.
Facebook open-sources DLRM, a deep learning recommendation model
Facebook AI Research (FAIR) open-sources a lot of its work, but its parent company is making DLRM available for free to help the wider AI community address challenges presented by recommendation engines, like a need for neural networks to associate categorical data with certain higher-level attributes. “Although recommendation and personalization systems still drive much practical success of deep learning within industry today, these networks continue to receive little attention in the academic community,” the paper reads. “By providing a detailed description of a state-of-the-art recommendation system and its open-source implementation, we hope to draw attention to the unique challenges that this class of networks present in an accessible way for the purpose of further algorithmic experimentation, modeling, system co-design, and benchmarking.” The makers of DLRM suggest the model be used to benchmark the speed and accuracy of recommendation engines. The DLRM benchmark for experimentation and performance evaluation is written in Python and supports random and synthetic inputs.
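For readers unfamiliar with the architecture, here is a minimal, illustrative PyTorch sketch of the general pattern the paper describes - embedding tables for sparse categorical features, a bottom MLP for dense features, and pairwise dot-product feature interactions feeding a top MLP. The layer sizes and names are assumptions for illustration; this is not Facebook's open-source implementation.

```python
import torch
import torch.nn as nn

class TinyDLRM(nn.Module):
    """Illustrative DLRM-style model, not the official implementation."""

    def __init__(self, num_dense, cardinalities, emb_dim=16):
        super().__init__()
        # One embedding table per categorical (sparse) feature.
        self.embeddings = nn.ModuleList(
            nn.Embedding(card, emb_dim) for card in cardinalities
        )
        # Bottom MLP projects dense features into the same space.
        self.bottom_mlp = nn.Sequential(
            nn.Linear(num_dense, 32), nn.ReLU(), nn.Linear(32, emb_dim)
        )
        # Top MLP consumes the dense vector plus all pairwise interactions.
        n_vecs = len(cardinalities) + 1
        n_pairs = n_vecs * (n_vecs - 1) // 2
        self.top_mlp = nn.Sequential(
            nn.Linear(emb_dim + n_pairs, 32), nn.ReLU(), nn.Linear(32, 1)
        )

    def forward(self, dense, sparse):
        # dense: (batch, num_dense) floats; sparse: (batch, num_cat) int64
        vecs = [self.bottom_mlp(dense)]
        vecs += [emb(sparse[:, i]) for i, emb in enumerate(self.embeddings)]
        stacked = torch.stack(vecs, dim=1)        # (batch, n_vecs, emb_dim)
        inter = torch.bmm(stacked, stacked.transpose(1, 2))
        i, j = torch.triu_indices(stacked.size(1), stacked.size(1), offset=1)
        pairs = inter[:, i, j]                    # pairwise dot products
        out = self.top_mlp(torch.cat([vecs[0], pairs], dim=1))
        return torch.sigmoid(out).squeeze(1)      # e.g. click probability

# Random inputs, mirroring the benchmark's support for synthetic data.
model = TinyDLRM(num_dense=4, cardinalities=[100, 50, 10])
dense = torch.rand(8, 4)
sparse = torch.stack([torch.randint(0, c, (8,)) for c in (100, 50, 10)], dim=1)
print(model(dense, sparse).shape)  # torch.Size([8])
```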
Google debuts Deep Learning Containers in beta
The service, called Deep Learning Containers, can run both in the cloud and on-premises. It consists of numerous performance-optimized Docker containers that come packaged with the various tools necessary to run deep learning algorithms. Those tools include preconfigured Jupyter Notebooks, which are interactive tools used to work with and share code, equations, visualizations and text, and Google Kubernetes Engine clusters, which are used to orchestrate multiple container deployments. The service also provides machine learning acceleration with Nvidia Corp.’s graphics processing units and Intel Corp.’s central processing units, and Nvidia’s CUDA, cuDNN and NCCL machine learning libraries are thrown in. In a blog post Wednesday, Google software engineer Mike Cheng explained that Deep Learning Containers are designed to provide all of the dependencies needed to get applications up and running in the fastest possible time. The service also integrates with various Google Cloud services, such as BigQuery for analytics, Cloud Dataproc for Apache Hadoop and Apache Spark, and Cloud Dataflow for batch processing and streaming data using Apache Beam.
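As a rough sketch of the on-premises workflow, the snippet below uses Python's subprocess module to pull one of the published images and run it locally with its bundled JupyterLab exposed. The image path (gcr.io/deeplearning-platform-release/tf-cpu) and the 8080 port mapping are assumptions based on Google's public deeplearning-platform-release registry; treat them as placeholders.

```python
import subprocess

# Assumed image path: Google publishes Deep Learning Container images
# under the deeplearning-platform-release repository on gcr.io.
IMAGE = "gcr.io/deeplearning-platform-release/tf-cpu"

# Pull the container image.
subprocess.run(["docker", "pull", IMAGE], check=True)

# Run it detached, mapping the bundled JupyterLab server to localhost:8080
# (port assumed from Google's documentation for these images).
subprocess.run(["docker", "run", "-d", "-p", "8080:8080", IMAGE], check=True)

print("JupyterLab should now be reachable at http://localhost:8080")
```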
Implementing IoT – overcoming barriers to commercial adoption
The basic architecture of IoT comprises four domains: the sensors, the connectivity of those sensors, the data hub that enables the data from all sorts of sensors to be interoperable (rather than stuck in silos), and the applications. The data hub plays a vital role in presenting the data to the applications in a uniform way, and Davies highlighted the work being done at CityVerve, a smart city demonstrator in Manchester encompassing a smart cycle light trial to understand cycle usage and improve cycle routes, an air quality trial linked to traffic density, and a water usage trial for leak management and demand management. Edge computing will play an important role in reducing connectivity demands, and zero-touch device management will be essential. Stuart Higgins, head of smart cities and IoT at Cisco, talked about some of the IoT trials and commercial deployments in the UK and worldwide. Many companies are digitising – seeing their operations and products as data to be managed in an IoT context.
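To make the data hub role concrete, here is a minimal, hypothetical sketch of the normalization step such a hub performs: mapping vendor-specific sensor payloads onto one uniform record that any application can consume. The field names and the two input formats are invented for illustration.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Reading:
    """Uniform record the hub hands to every application."""
    sensor_id: str
    kind: str       # e.g. "air_quality", "water_flow", "cycle_count"
    value: float
    unit: str
    timestamp: datetime

def normalize(payload: dict) -> Reading:
    """Map vendor-specific payloads onto the uniform schema.

    The two formats below stand in for the silo-specific data a hub
    must reconcile; both are invented examples.
    """
    if "pm25" in payload:                 # hypothetical air-quality vendor
        return Reading(payload["device"], "air_quality",
                       float(payload["pm25"]), "ug/m3",
                       datetime.fromtimestamp(payload["ts"], tz=timezone.utc))
    if "flow_lpm" in payload:             # hypothetical water-meter vendor
        return Reading(payload["meter_id"], "water_flow",
                       float(payload["flow_lpm"]), "L/min",
                       datetime.fromisoformat(payload["time"]))
    raise ValueError("unknown sensor format")

print(normalize({"device": "aq-17", "pm25": 12.4, "ts": 1561939200}))
print(normalize({"meter_id": "wm-3", "flow_lpm": 8.1,
                 "time": "2019-07-01T00:00:00+00:00"}))
```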
3 serverless development strategies for stateful applications
Functions should be directly accessible to each other. Without immediate connections, functions depend on a slow storage medium to transport data from one function to another, which builds up latency. In real-time application scenarios -- such as 24/7 monitoring systems -- that latency is unacceptable. Serverless functions predominantly underpin short-term workloads, which means that resources are allocated to them when requested and taken away once the request ends. Stateful applications built on serverless functions can't rely on traditional mechanisms, such as global variables that hold data throughout the application's lifetime. Stateless functions can't read from and write to local disk between invocations, and the application can't maintain a constant connection to the database. To create stateful applications, serverless developers can manage application state with database connections, an event payload or backend as a service (BaaS) to integrate with the application.
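As a minimal sketch of the event-payload approach, the Lambda-style handlers below are hypothetical: each step returns its accumulated state inside the payload it passes forward, so no function needs globals, local disk, or a held-open connection.

```python
import json

# Hypothetical Lambda-style handlers: state travels inside the event
# payload rather than in globals, local disk, or a long-lived connection.

def validate_order(event, context=None):
    order = event["order"]
    # Pass everything the next step needs forward in the payload.
    return {"order": order, "valid": order.get("qty", 0) > 0}

def price_order(event, context=None):
    state = dict(event)                   # carry prior state forward
    state["total"] = state["order"]["qty"] * state["order"]["unit_price"]
    return state

# Chaining the steps the way an orchestrator (e.g. a step-function
# workflow) would: the output payload of one call feeds the next.
event = {"order": {"qty": 3, "unit_price": 9.99}}
result = price_order(validate_order(event))
print(json.dumps(result, indent=2))
```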
Quote for the day:
"A leader is best when people barely know he exists, when his work is done, his aim fulfilled, they will say: we did it ourselves." -- Laotzu