Daily Tech Digest - September 23, 2019

Artificial intelligence in marketing: when tech converges with the traditional

The first step is to identify what is meant by a ‘buying coalition’: the temporary alliance of distinct personas assembled to make a purchasing decision. Previously, organisations had no automated contact creation; knowledge of the coalition lived only in the sales rep’s mind. Now, organisations are able to more accurately target the right buyers at the right time. By applying AI to CRM data, organisations can create full contact records for all potential buyers engaged in the decision-making process, capturing relevant activity and associating it with the correct opportunity. AI can then track all touchpoints with the various opportunity contacts, analyse how they were engaged and who was engaged before and after them, to show both the optimal number of buyers needed to close a deal and how to sequence and communicate with those buyers in order to build a strategic coalition. This can only be done once AI has mapped the CRM data to the correct opportunity. We’re able to do this with our own persona-driven analysis as well.
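The mapping step described here, associating each touchpoint with the correct opportunity and recovering the engagement sequence, can be sketched as a simple grouping exercise. This is an illustrative sketch only; the record fields (`contact`, `opportunity_id`, `timestamp`) are invented for the example and are not any vendor's actual CRM schema.

```python
from collections import defaultdict
from datetime import datetime

# Illustrative touchpoint records; field names are assumptions, not a real CRM schema.
touchpoints = [
    {"contact": "cfo", "opportunity_id": "opp-1", "timestamp": datetime(2019, 9, 1)},
    {"contact": "it_lead", "opportunity_id": "opp-1", "timestamp": datetime(2019, 9, 3)},
    {"contact": "buyer", "opportunity_id": "opp-2", "timestamp": datetime(2019, 9, 2)},
]

def map_touchpoints(touchpoints):
    """Group touchpoints by opportunity and order them chronologically,
    yielding the engagement sequence for each buying coalition."""
    by_opportunity = defaultdict(list)
    for tp in touchpoints:
        by_opportunity[tp["opportunity_id"]].append(tp)
    return {
        opp: [tp["contact"] for tp in sorted(tps, key=lambda t: t["timestamp"])]
        for opp, tps in by_opportunity.items()
    }

print(map_touchpoints(touchpoints))
# {'opp-1': ['cfo', 'it_lead'], 'opp-2': ['buyer']}
```

Once touchpoints are grouped per opportunity like this, the ordered contact sequence is what an analysis layer would mine for coalition size and outreach ordering.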

Navigating the .NET Ecosystem

While it’s easy to get caught up in the past and grumble over previous concerns and frustrations, we must move forward. Arguably, one of the most logical paths forward is to unify .NET Core and .NET Framework ... dare I say, "Let’s make .NET great again!" Maybe I’ve gone too far, but let’s discuss the future. Where is Microsoft steering us? Let’s take a step back for a moment and discuss where we’ve come from before diving into where we’re going. Not all .NET developers are aware of how their code compiles, and what it truly produces. "From the very beginning, .NET has relied on a just-in-time (JIT) compiler to translate Intermediate Language (IL) code to optimized machine code." — Richard Lander. Revisiting my earlier mention of the Mono project, we know there have been significant efforts around ahead-of-time (AOT) compilation for .NET; Mono has achieved this with its industry-leading LLVM compiler infrastructure.

Since last year, Google has been working with online developer education company Udacity to provide free lessons and has now packaged them as 'codelab courses that are formatted like tutorials'. The courses are aimed at developers who have some experience programming in object-oriented, statically typed languages like Java or C# and who've used IDEs such as JetBrains' IntelliJ IDEA, Android Studio, Eclipse, or Microsoft's Visual Studio. Students will need to install the Java Development Kit (JDK) and IntelliJ. Google promotes Kotlin as a "concise" and "modern object-oriented language [that] offers a strong type system, type inference, null safety, properties, lambdas, extensions, coroutines, higher-order functions". Google started offering free courses last year via the Kotlin Bootcamp course and is now offering them in the Google Developers Codelabs format. "Google and Udacity currently offer video-based courses for Kotlin Bootcamp and How to build Android apps in Kotlin," said Jocelyn Becker, senior program manager of Google Developer Training.

Your competitive edge: Unlock new possibilities with Edge computing

Today more than ever, there is little tolerance from employees, clients, or consumers when it comes to IT failures; fortunately, we are entering a new age of technologies that will minimize this risk. Software-defined networks (SDN) are becoming increasingly adept at identifying specific customer needs and routing workloads to the locations that best serve specific cost, latency and security requirements. When combined with the potential of AI to inform decision-making, we can see a perfect storm of capabilities that will deliver a step change in how computing and networks converge to deliver a personalized service to customers who embrace these technologies. The added capability of MEC provides an extra layer, protecting essential services from outages or connectivity issues stemming from rare but damaging ISP failures or cloud server downtime. This approach to network architecture can also enable local hosting of data, allowing data privacy to be better governed.
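The placement decision described above, matching a workload to the location that best serves its cost, latency and security requirements, can be sketched as a constrained scoring problem. This is a minimal sketch under invented assumptions: the site names, metric values and weights are all illustrative, not taken from any real SDN controller.

```python
# Hypothetical edge sites; the metrics and weights are illustrative assumptions.
sites = {
    "on-prem-edge": {"cost": 0.7, "latency_ms": 5, "security": 0.9},
    "regional-dc":  {"cost": 0.5, "latency_ms": 25, "security": 0.7},
    "public-cloud": {"cost": 0.3, "latency_ms": 60, "security": 0.6},
}

def place_workload(sites, max_latency_ms, weights):
    """Pick the site that meets the hard latency requirement and scores best
    on a weighted blend of high security and low cost."""
    candidates = {
        name: weights["security"] * m["security"] - weights["cost"] * m["cost"]
        for name, m in sites.items()
        if m["latency_ms"] <= max_latency_ms
    }
    return max(candidates, key=candidates.get) if candidates else None

print(place_workload(sites, max_latency_ms=30, weights={"cost": 0.4, "security": 0.6}))
# → on-prem-edge
```

The design choice worth noting is that latency is treated as a hard constraint (a filter) while cost and security trade off continuously, which mirrors how a latency-sensitive service would rule out distant regions before optimizing the rest.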

New application security risks lead IT teams to DevSecOps

This type of organizational change doesn't come easily, but progress is possible, Ewert said. "When I started four years ago, there was a real 'us vs. them' mentality between developers and security," he said. "It took us three years to get the culture to the point where they approach us [for help]." That shift came about through training sessions and explanations of why certain security rules and practices are necessary, Ewert said. The SecOps team also found unobtrusive ways to dovetail its efforts with the rest of the organization's; for example, it piggybacked on an IT monitoring project that rolled out standardized monitoring agents across the organization to get security monitoring agents installed. It has also been helpful for Ewert's team to frame security recommendations in terms of how they can make the application and infrastructure more efficient, such as by avoiding costly distributed denial-of-service attacks. "We only block a release if it's absolutely critical."

Data Everywhere At the Edge of Compute & Energy Efficiency

In traditional or classic software development, or SW 1.0, a typical project for a software team may involve creating algorithms and writing millions of lines of instruction code. SW 2.0 is a new way of thinking in which value creation comes not from code writing, but from data curation. We collect data from our devices, select relevant data sets, verify and label them. We then use that “curated” data to train machine learning (ML) models. Imagine the data involved in running a large printing press. Curating data from a printing press might include engineers looking at images of final press output, finding defects (e.g., lines, splotches, roller marks, etc.), then training an ML model to detect those defects. The model “runs” in real-time at the press to monitor output. With SW 2.0, models for simple problems are trained quickly, and issues that might otherwise be intractable just take a bit longer.
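The curate-then-train loop can be shown in miniature: label a handful of samples, then let "training" pick the decision rule instead of hand-coding it. This is a deliberately tiny sketch; the single edge-intensity feature and the sample values are invented, and a real press-monitoring system would train an image model on far richer curated data.

```python
# Curated, labeled samples: (edge-intensity score, is_defect).
# Feature and values are illustrative assumptions, not real press data.
labeled = [(0.1, False), (0.2, False), (0.25, False),
           (0.7, True), (0.8, True), (0.9, True)]

def train_threshold(samples):
    """'Training' in miniature: choose the decision threshold that best
    separates defect scores from non-defect scores in the curated data."""
    best_t, best_acc = 0.0, 0.0
    for t in sorted(score for score, _ in samples):
        acc = sum((score >= t) == label for score, label in samples) / len(samples)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

threshold = train_threshold(labeled)

def is_defect(score, t=threshold):
    # The "model" running in real time at the press is just the learned rule.
    return score >= t

print(is_defect(0.85), is_defect(0.15))  # True False
```

The SW 2.0 point survives even at this scale: the decision rule came from the labeled data, so improving the system means curating better samples, not rewriting the function.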

Enterprises can hire software robots at the push of a button

The IT services roles are internal, providing services to IT teams, but most of the others in banking, insurance and retail are customer-facing, with bots being the first point of contact for customers making enquiries. Dube said about 112,000 customer calls a day are handled by IPsoft customer Telefonica in Peru using Amelia. “These calls are made when people have a problem with their account,” he said. “Humans have to be liberated from these chores.” Dube described the jobs being done by robots as “high-friction, low-margin roles”. They are jobs that have to be done and have high costs, but businesses do not make money out of them. Firms are replacing large numbers of staff in such roles with software robots. The World Economic Forum’s Future of Jobs 2018 report predicted that 75 million current jobs will be displaced by automation by 2022, and that machines will perform 52% of current work tasks by 2025.

Interview with Scott Hunter on .NET Core 3.0

Many times when we talk about .NET Core 3.0, we talk about the new desktop support, but there is also a lot of innovation in ASP.NET. First up, while we are not bringing back WCF, we know that many developers want to program high-performance, contract-based RPC services in their applications. We are embracing the open-source gRPC project for these workloads. We are working to make the .NET implementation first class, and because it is gRPC, it works with many other programming languages as well. There is a new microservices-related Worker Service project for building lightweight background workers, which can be run under orchestrators like Kubernetes. Also, while ASP.NET has great support for building APIs, we want to make it easy to put rich security on top of them, so we are adding bridges to use our APIs with the open-source Identity Server project. Finally, we are working on Blazor, which enables developers to build high-performance web applications using .NET in both the browser and server, using WebAssembly.
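"Contract-based" in the gRPC sense means the service interface is declared once in a Protocol Buffers file, from which gRPC tooling generates matching client and server code for .NET and many other languages. The service and message names below are invented for illustration; they are not from the interview.

```proto
// Illustrative contract; names are hypothetical.
syntax = "proto3";

service OrderStatus {
  // A single RPC: the client sends an order id, the server replies with state.
  rpc GetStatus (StatusRequest) returns (StatusReply);
}

message StatusRequest {
  string order_id = 1;
}

message StatusReply {
  string state = 1;
}
```

Because both sides are generated from this one definition, a .NET server and, say, a Go or Java client stay in agreement by construction, which is the cross-language property the interview highlights.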

Innovation: How to get your great ideas approved

Historical precedent suggests that a top-down approach to innovation can produce big benefits. Harvard Business Review (HBR) tracked the inventing history of 935 CEOs at publicly listed US high-tech companies and found that one in five of these successful firms has what it refers to as an inventor CEO: a chief executive who is named on at least one patent. While the research is focused on the high-tech sector, HBR concludes that boards of directors should pay close attention to the inventor credentials of their executive teams. In the case of RBS, Hanley believes the senior team's role is to help point out opportunities and ensure any innovations help the business keep pace with a fast-changing finance market. "I guess our job is to not only think about the future, but almost work backwards from the future – we need to understand and have a point of view as to how we think the world is changing," says Hanley. "We need to articulate that view to all of our key stakeholders and then make sure we do something about it."

Cloud security: Weighing up the risk to enterprises

A good analogy is that clouds are like roads; they facilitate getting to your destination, which could be a network location, application or a development environment. No-one would enforce a single, rigid set of rules and regulations for all roads – many factors come into play, whether it’s volume of traffic, the likelihood of an accident, safety measures, or requirements for cameras. If all roads carried a 30 mile per hour limit, you might reduce fatal collisions, but motorways would cease to be efficient. Equally, if you applied a 70 mile per hour limit to a pedestrian precinct, unnecessary risks would be introduced. Context is very important, imperative in fact. The same goes for cloud computing. To assess cloud risk, it is vital that we define what cloud means. Yet cloud adoption continues to grow, and as it does, an explicit delineation between cloud and on-premise may no longer be necessary. Is the world of commodity computing displacing traditional datacentre models to such an extent that soon all computing will be elastic, distributed and based on virtualisation?

Quote for the day:

"Leadership does not always wear the harness of compromise." -- Woodrow Wilson
