Besides scouting the next technologies for a company to adopt, a CTO who succeeds in the modern landscape must have money management skills. These skills are particularly important when leading a startup, given the financial constraints typical of a newer company, but keeping a careful watch on tech-related spending is a crucial part of a CTO’s role regardless of the company’s age. Gartner predicted that worldwide IT spending would grow by 3.7% in 2020; even so, the potential to invest in new technology doesn’t exist if the company has a perpetually maxed-out or mismanaged budget. Whether being wise about expenditures means investigating new cloud providers to find more reasonable rates, or switching a team file storage tool from a monthly subscription to a cheaper annually billed plan, CTOs should remain alert for practical ways to cut spending. Staying involved in keeping costs down gives the chief technology officer more freedom to invest in new technologies at the right time.
Most companies we look at that don't surge are obsessed with their competitors. They compare pricing, products, marketing initiatives, and, if they can, costs and operating models. They seek to stay one step ahead of their competitors (or at least no more than a step or two behind). They are completely focused on market share. But companies that surge don't think like that. They don't look at their competitors, at least not quite so obsessively. They look at their customers, or potential future customers, and focus on how they can provide better value for the customer — profitably. ... Many established companies are risk averse. They adopt risk management processes, report quarterly to shareholders, and are careful not to disturb their market positioning in the eyes of investors. Having the courage to bet the company on a new product or service that fundamentally transforms a business and enables it to grow multiple times larger is rare. Even if the courage can be found, the bet is hard to pull off given the questions that boards and lenders will ask. However, we have found that companies in ASEAN markets can often make bet-the-company decisions quickly because family control means that a close-knit group drives the key decisions.
Open source (and its kissing cousin, free software) guarantees three legal rights: "Free-of-charge use of the software, access to and modification of the source code, and [the ability] to pass on the source code and a binary copy." In turn, the license specifies the obligations the downstream recipient of the software must perform if she modifies the software and then distributes it. ... Today we're running into trouble because that "specificity" Riehle highlights is becoming, well, too specific, with developers blocking certain classes of organizations from using their software. In our fractious and fraught world, this is understandable. Unfortunately, it's not open source, given that non-discrimination is a cardinal virtue of open source licensing. Even so, this debate is far from over, which proves to be one of the great things about open source: Community. We don't always get along, but we're usually willing to talk about it. If legal innovation is the "brain" of open source, community is the "heart." While collaborative development didn't start with open source, open source has done more to codify the practice than anyone or anything else.
As with any addition to your stack, API Gateways introduce another piece to manage. They need to be hosted, scaled, and managed just like the rest of your software. Since all requests and responses must pass through the gateway, they add an additional point of failure and increase the latency of each call by adding a few extra "hops" across the network. Due to their centralized location, it becomes easy to gradually increase the complexity inside the gateway until it becomes a "black box" of code, which makes it harder to maintain. ... Gateways let clients access services, but what happens when services need to talk to one another? That's where a service mesh comes in. A service mesh is a layer focused on service-to-service communication. You'll see gateway communication described as North-South (from clients to the gateway) and service mesh communication described as East-West (between services). Traditionally it has made sense to use a service mesh and an API gateway together: the gateway serves as the entry point for your clients' requests, and the service mesh lets your services call one another before responses pass back out through the gateway.
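The gateway's core job — mapping incoming client paths to internal services — can be sketched as a simple routing table. This is a minimal illustration, not a production gateway; the service names and addresses are hypothetical, and a real gateway would also handle authentication, rate limiting, and retries:

```python
# Minimal sketch of API-gateway path routing (hypothetical routes).
ROUTES = {
    "/orders": "http://orders-svc:8080",    # assumed internal addresses
    "/users": "http://users-svc:8080",
    "/catalog": "http://catalog-svc:8080",
}

def route(path: str) -> str:
    """Return the upstream URL that a client request path should be forwarded to."""
    for prefix, upstream in ROUTES.items():
        if path == prefix or path.startswith(prefix + "/"):
            # Forward the remainder of the path to the matched upstream service.
            return upstream + path[len(prefix):]
    raise LookupError(f"no route for {path}")

# North-South traffic: client -> gateway -> internal service.
print(route("/orders/42"))  # http://orders-svc:8080/42
```

Because every request funnels through a table like this, adding cross-cutting logic (auth checks, logging, transformations) at each lookup is easy — which is exactly how the gateway can drift toward the "black box" described above.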
In developing their cyber security strategy, aviation businesses need to understand their supply chain and ensure their own cyber security is robust and reliable. They need to know who has access to which systems, and make sure that vendors have the right practices and procedures in place to deal with the cyber threat. There are several steps the industry can take to secure infrastructure, mitigate risk and ensure resilience in the face of the growing cyber threat. To support businesses in this endeavour, the Civil Aviation Authority in the UK recently launched the ASSURE framework. Developed in collaboration with the Council for Registered Ethical Security Testers (CREST), the ASSURE scheme is designed to enable the aviation industry, including airlines, airports and air navigation service providers, to manage cyber security risk without compromising aviation security or resilience. Everything must be done to limit the threat and make it as difficult as possible for attackers to breach the organisation’s security systems. Achieving this cannot be done without IT teams and OT teams working together on cyber security.
The survey shows that higher-scoring respondents tend to be more affluent and show greater engagement with a variety of digital activities. Despite their higher degree of risk exposure, they also exhibit better awareness of and increased caution about cyber risks. As a whole, respondents showed a high degree of concern about data privacy, although half of them are willing to connect through smart devices for better convenience. With regard to the use of financial services, 72 per cent of respondents felt uncomfortable linking their bank account with a third-party app. When it comes to cross-generational analysis, Gen Z received the highest scores in knowledge and attitude, but the lowest in behavior. For Gen X, support is needed to help them build tech-related knowledge, such as how to handle privacy settings, two-factor authentication (2FA) and biometric authentication (BA). Among Gen Y respondents, slightly more pay attention to suspicious activity alerts, but they still have some knowledge and behavioral gaps to address.
A smart data center can make an e-governance system agile and responsive, while fostering a learning environment and combining best practices, predictive analytics and IT automation. It taps into the power of artificial intelligence (AI) and analytics to achieve positive operational outcomes, optimize cooling and overall data center performance, maximize customer experience, and lower risk and IT costs. While identifying the root cause of issues and their impact on business in minutes, a smart data center can lower the Total Cost of Ownership (TCO) by up to 20% and decrease IT response time by up to 30%, besides providing fast, accurate, contextual, actionable insights on a proactive basis. Moreover, as smart cities unleash the full power of Big Data, IoT, Cloud and streaming services, there is a need for real-time collection and analysis of data on utilities, traffic, security and infrastructure to enable city officials to respond to problems faster than ever before. Hence, there is no room for latency in e-governance services. End users and devices demand anywhere, anytime access to applications and services, and this creates the need for setting up edge data centers for efficient delivery of e-governance services.
In machine learning and artificial intelligence, an important type of problem is called classification. This article describes and compares four of the most commonly used classification techniques: logistic regression, perceptron, support vector machine (SVM), and single hidden layer neural networks. The goal of a classification problem is to predict the value of a variable that can take on discrete values. In binary classification the goal is to predict a variable that can be one of just two possible values, for example predicting the gender of a person (male or female). In multi-class classification the goal is to predict a variable that can take on one of three or more possible values, for example, predicting a person's state of residence (Alabama, Alaska, . . . Wyoming). Note that a regression problem is one where the goal is to predict a numeric value, for example the annual income of a person. There are dozens of ML classification techniques, and most of these techniques have several variations. One way to mentally organize ML classification techniques is to place each into one of three categories: math equation classification techniques, distance and probability classification techniques, and tree classification techniques. This article explains four of the most common math equation classification techniques; a future PureAI article will explain and compare common distance and probability techniques.
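To make the "math equation" category concrete, here is a minimal from-scratch sketch of one of the four techniques named above — the perceptron — doing binary classification on a toy, linearly separable dataset. The data, labels, and hyperparameters are illustrative, not from the article:

```python
# Minimal perceptron for binary classification (labels are +1 / -1).
def train_perceptron(data, labels, lr=0.1, epochs=20):
    """Learn weights w and bias b so that sign(w.x + b) matches each label."""
    w = [0.0] * len(data[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(data, labels):
            # Classify with the current linear decision boundary.
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else -1
            if pred != y:  # update weights only on mistakes
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else -1

# Toy dataset: label +1 when the first feature exceeds the second.
X = [(2, 1), (3, 1), (1, 2), (1, 3)]
y = [1, 1, -1, -1]
w, b = train_perceptron(X, y)
print([predict(w, b, x) for x in X])  # [1, 1, -1, -1]
```

The other three techniques in the article share this "math equation" shape — a weighted sum of inputs pushed through a decision rule — and differ mainly in how the weights are learned and how the boundary is drawn.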
"Instinctively, we feel that greater accuracy is better and all else should be subjected to this overriding goal," said Patrick Bangert, CEO of Algorithmic Technologies. "This is not so. While there are a few tasks for which a change in the second decimal place in accuracy might actually matter, for most tasks this improvement will be irrelevant--especially given that this improvement usually comes at a heavy cost." I get that, but I must confess I didn't get it very well a few years ago, when I was in charge of a financial institution's credit card operation and one of our board members was denied credit at the checkout of a home improvement store because an analytics system issued a false positive. Data science, IT, and business leaders responsible for analytics face the same quandary: to what degree of accuracy must the algorithm operating on the data perform for an analytics program to be declared "ready" for production? The answer depends on the nature of the problem you're trying to solve. If you're formulating a vaccine, you want to achieve results that exceed 95%. If you're predicting a general trend, the low 90s or even the 80s might suffice.
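The credit-card anecdote illustrates why a single accuracy number can mislead: a model can score well overall while still producing costly false positives. A small sketch with made-up confusion-matrix counts (purely illustrative, not from the article) makes the gap visible:

```python
# Illustrative counts for a credit model on 1,000 applicants.
# "Positive" means the model predicts "deny credit".
tp, fp = 40, 20     # correct denials; creditworthy applicants wrongly denied
fn, tn = 10, 930    # risky applicants wrongly approved; correct approvals

accuracy = (tp + tn) / (tp + fp + fn + tn)
deny_precision = tp / (tp + fp)  # share of denials that were actually justified

print(f"accuracy: {accuracy:.0%}")                    # 97%
print(f"precision of 'deny': {deny_precision:.1%}")   # 66.7%
```

With these numbers the model is 97% accurate, yet one in every three denials hits a good customer — exactly the kind of error that embarrasses a board member at a checkout. Which metric must clear which bar is the real "ready for production" question, and it depends on the cost of each error type.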
AI is revolutionizing how organizations digitally transform their security strategies as threats to customers' identities and personal data continue to proliferate. It's rare to hear a digital transformation strategy prioritize security; BMC's ADE framework is an exception, recognizing that securing customers' identities is a core part of delivering a positive customer experience. Organizations are turning to the Zero Trust Security (ZTS) framework to secure every network, cloud, and on-premises platform, operating system, and application across their supply chain and production networks. Chase Cunningham, Principal Analyst at Forrester, is a leading authority on Zero Trust Security, and his recent video, Zero Trust In Action, is worth watching to learn more about how manufacturers can secure their IT infrastructures. You can find his blog here. There are several fascinating companies to watch in this area, including MobileIron, which has created a mobile-centric, zero-trust enterprise security framework that manufacturers are relying on today.
Quote for the day:
“Five years down the line, all of our devices will have an emotion chip. We won’t remember when we couldn't just frown at a device” -- Rana El Kaliouby