
Daily Tech Digest - March 08, 2024

What is the cost of not doing enterprise architecture?

Without an EA, an organisation may struggle to show how its IT projects and technology decisions align with its business goals, leading to initiatives that do not support the overall business strategy or deliver optimal value. A company favouring growth through acquisition should be buying systems and negotiating contracts that support onboarding more users and more data and transactions without costs increasing significantly. The EA should make it possible to understand which processes and technology would be affected by the strategy, to model that impact, and to feed the result into the decision process. Equally, the architecture can consider strategic trends and be designed to support them: bankrupt US retailer Sears, for example, was slow to adopt e-commerce, allowing competitors to capture the growing online shopping market. ... Your Enterprise Architecture provides a framework for making informed decisions about IT investments and strategies. Without the holistic view that EA offers, decision-makers may lack the full context for their decisions, leading to choices that are suboptimal or that fail to consider the interdependencies and long-term implications for the organisation.


Making Software Development Boring to Deliver Business Value

Boerman argued that software development should become boring. He made the distinction between boring software and exciting software: boring software in that categorization resembles all software that has been built countless times, and will be built countless times more. In this context, I am specifically thinking about back-end systems, though this rings true for front-end systems as well. Exciting software is all the projects that require creativity to build. Think about purpose-built algorithms, automations, AI integrations, and the like. Making software development boring again is about placing a prime focus on delivering business value, and making the delivery of these aspects predictable and repeatable, Boerman argued. This requires moving infrastructure out of the way in such a way that it is still there, but does not burden the day-to-day development process: while infrastructure takes most of the development time, it technically delivers the least amount of business value, which can be found in the data and the operations executed against it. New exciting experiments may be fast-moving and unstable, while the boring core is meant to be and remain of high quality such that it can withstand outside disruptions, Boerman concluded.


New TDWI Assessment Examines the State of Data Quality Maturity Today

“With data becoming such a critical part of a business’s ability to compete, it’s no wonder there’s a growing emphasis on data quality,” Halper began. “Organizations need better and faster insights in order to succeed, and for that they need better, more enriched data sets for advanced analytics -- such as predictive analytics and machine learning.” She explained that to do this, organizations are not only increasing the amount of traditional, structured data they’re collecting, they’re also looking for newer data types, such as unstructured text data or semistructured data from websites. Taken together, these various types of data can offer significantly more opportunities for insights, she added. As an example, Halper mentioned the idea of an organization using notes from its call center -- typically unstructured or semistructured text data -- to analyze customer satisfaction, either with a particular product or with the company as a whole. This information can then be fed back into an analytics or machine learning routine and reveal patterns or other insights meaningful to the company. “Regardless of the type of data or its end use,” she said, “the original data must be high quality. It must be accurate, complete, timely, trustworthy, and fit for purpose.”
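
Halper's call-center example can be sketched in miniature. The snippet below is purely illustrative (the lexicons and product names are invented, and real systems would use a trained model rather than keyword matching): it scores free-text notes with a crude lexicon and aggregates the scores per product, the kind of signal that could then feed an analytics or machine learning routine.

```python
# Illustrative only: score free-text call-center notes for customer sentiment,
# then aggregate per product so patterns can feed an analytics routine.
from collections import defaultdict

NEGATIVE = {"angry", "broken", "refund", "cancel", "slow"}
POSITIVE = {"happy", "great", "resolved", "thanks", "fast"}

def score_note(note: str) -> int:
    """Crude lexicon score: +1 per positive token, -1 per negative token."""
    tokens = note.lower().split()
    return sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)

def satisfaction_by_product(notes):
    """notes: iterable of (product, free_text). Returns product -> mean score."""
    totals, counts = defaultdict(int), defaultdict(int)
    for product, text in notes:
        totals[product] += score_note(text)
        counts[product] += 1
    return {p: totals[p] / counts[p] for p in totals}
```

As Halper stresses, the output is only as good as the input: misspelled, stale, or duplicated notes would corrupt the aggregate no matter how sophisticated the downstream model.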


The Five Biggest Challenges with Large-Scale Cloud Migrations

Several issues can arise when attempting to migrate legacy systems to the cloud. The system may not be optimized for cloud performance and scalability, so it is important to develop and implement solutions that boost the system’s speed and capacity to get the most from the cloud migration. Other issues common with legacy system integration include data security, data integrity, and cost management. The latter is often a particular concern because companies may also be required to pay for training and maintenance in addition to the cost of migration. ... The risks of migrating data to the cloud include data security, data corruption, and excessive downtime, which can cost money and negatively impact performance. To optimize migration success and minimize downtime, it is vital for companies to understand the amount of data involved and the bandwidth necessary to complete the transfer with minimal work disruption. ... Due to poor infrastructure and configuration, many companies cannot take advantage of the benefits of cloud computing. Often, companies fail to maximize the move from fixed infrastructure to scalable and dynamic cloud resources.
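
The "amount of data and the bandwidth necessary" point lends itself to a back-of-the-envelope calculation. This sketch (illustrative numbers, and an assumed 70% link efficiency to account for protocol overhead and contention) estimates how long a bulk transfer will take so a migration window can be sized realistically.

```python
# Rough planning sketch: estimate how long a bulk data transfer will take,
# so a migration window can be sized to limit work disruption.
def transfer_hours(data_tb: float, link_mbps: float, efficiency: float = 0.7) -> float:
    """Hours to move data_tb terabytes over a link_mbps link.

    efficiency discounts protocol overhead and contention (assumed 70%).
    """
    bits = data_tb * 8 * 10**12               # decimal TB -> bits
    seconds = bits / (link_mbps * 10**6 * efficiency)
    return seconds / 3600
```

For example, 10 TB over a 1 Gbps link at 70% efficiency works out to roughly 32 hours, which is why large migrations often combine network transfer with physical shipment or staged cutovers.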


Getting the BELT: Empowering Executive Leadership in Data Governance

The active engagement of the ELT in the data governance process is critical not only for setting a strategic direction, but also for catalyzing a shift in organizational mindset. By championing the principles of NIDG, the ELT paves the way for a governance model that is both effective and sustainable. This leadership commitment helps in breaking down silos, promoting cross-departmental collaboration, and establishing a shared vision that recognizes data as a pivotal asset. Through their actions and decisions, executive leaders serve as role models, demonstrating the value of data governance and encouraging a culture of continuous improvement. Their involvement ensures that data governance initiatives are aligned with business strategies, driving the organization toward achieving its goals while maintaining data integrity and compliance. ... The journey towards effective data governance begins with buy-in, not just from the ELT, but across the entire organization. Achieving this requires the ELT to understand the strategic importance of data governance and to communicate this value convincingly. 


Going passwordless with passkeys in Windows and .NET

Passkeys managed by Windows Hello are “device-bound passkeys” tied to your PC. Windows can support other passkeys, for example passkeys stored on a nearby smartphone or on a modern security token. There’s even the option of using third parties to provide and manage passkeys, for example via a banking app or a web service. Windows passkey support allows you to save keys on third-party devices. You can use a QR code to transfer the passkey data to the device, or if it’s a linked Android smartphone, you can transfer it over a local wireless connection. In both cases the devices need a biometric identity sensor and secure storage. As an alternative, Windows will work with FIDO2-ready security keys, storing passkeys on a YubiKey or similar device. A Windows Security dialog helps you choose where to save your keys and how. If you’re saving the key on Windows, you’ll be asked to verify your identity using Windows Hello before the passkey is saved locally. If you’re using Windows 11 22H2 or later, you can manage passkeys through Windows settings.


Generative AI on its own will not improve the customer experience

Businesses around the world hope that, beyond the hype of generative AI, there lies a near-term path to improving business efficiency and in parallel a longer-term ability to grow revenue. There is one, not insignificant, consideration to weigh before the true savings can be measured. In 2024, as in 2023, generative AI and ChatGPT both trail "Customer Service / Telephone number" as search terms on Google in most countries. Most of those searches involve a quest by a customer to reach a human being. There is great frustration because most businesses are working hard to make it difficult to reach a person. This gap between the corporate commitment to removing the human connection in customer service and the customer's desire for a human connection almost always points to a bad business process. The business must examine why the customer doesn't use the self-service channel. This discovery process is a precursor to deeper self-service powered by generative AI. Our first recommendation is to step back and ensure the customer service process you want to supercharge with generative AI satisfies customers. 


How continuous SDL can help you build more secure software

Beyond making the SDL automated, data-driven, and transparent, Microsoft is also focused on modernizing the practices that the SDL is built on to keep up with changing technologies and ensure our products and services are secure by design and by default. In 2023, six new requirements were introduced, six were retired, and 19 received major updates. We’re investing in new threat modeling capabilities, accelerating the adoption of new memory-safe languages, and focusing on securing open-source software and the software supply chain. We’re committed to providing continued assurance to open-source software security, measuring and monitoring open-source code repositories to ensure vulnerabilities are identified and remediated on a continuous basis. Microsoft is also dedicated to bringing responsible AI into the SDL, incorporating AI into our security tooling to help developers identify and fix vulnerabilities faster. We’ve built new capabilities like the AI Red Team to find and fix vulnerabilities in AI systems. By introducing modernized practices into the SDL, we can stay ahead of attacker innovation, designing faster defenses that protect against new classes of vulnerabilities.


Rethinking SDLC security and governance: A new paradigm with identity at the forefront

Poorly governed identities have become a gateway for substantial incidents. High-profile breaches at companies like LastPass and Okta have illuminated the attackers' method: exploiting the identity attack vector to orchestrate some of the most notable breaches, using compromised accounts to potentially alter source code and extract valuable information. These events underscore a clear and present trend of identity theft through phishing or ransomware attacks, which then pave the way for attackers to infiltrate the software development lifecycle (SDLC), leading to the insertion of malicious code and the theft of data. Despite the clear risks, organizations continue to fumble in securing and managing these identities, making it the riskiest yet most overlooked attack vector facing SDLC security and governance today. As we pivot to address this critical oversight, it's imperative to understand the role of identity within the SDLC. The “Inverted Pyramid" analogy is a useful conceptual framework that captures the essence of the old and new paradigms and how reorienting our approach can better protect against these insidious threats.


Analyzing the CEO–CMO relationship and its effect on growth

It’s estimated that only 10 percent of Fortune 250 CEOs have marketing experience. There’s also a dramatic acceleration of digital technology in the world of marketing. We’re no longer judging marketing by television commercials. There’s a whole slew of different components to think through. And the data piece that you hinted at is that these customers’ signals are now everywhere. It’s incumbent upon us as marketers to interpret them and feed them back to our organizations in such a way that we don’t talk about data but we talk about insights and are able to connect the dots. ... As we come up with a means to measure marketing, the CEO or CFO needs to learn the measurement systems in place to understand what it means when I cut budget, what it means when I invest in it, and how we tie those activities to outcomes. That robust measurement system can help you understand your brand, how your customers perceive your brand, and what level of fidelity they give you credit for. That’s where the brand scores are really helpful. But you also need an econometric model to connect how the money you’re spending on different channels such as video, content, and search—all working in tandem—helps create the results you want.



Quote for the day:

"Success is the sum of small efforts, repeated day-in and day-out." -- Robert Collier

Daily Tech Digest - September 23, 2019

Artificial intelligence in marketing: when tech converges with the traditional

The first step is to identify what is meant by a ‘buying coalition’. A buying coalition more accurately reflects the temporary alliance of distinct personas that are assembled to make a purchasing decision. Previously, organisations did not have automated contact creation; that knowledge lived only in the sales rep's mind. Now, organisations are able to more accurately target the right buyers at the right time. By applying AI to CRM data, organisations are able to create full contacts for all potential buyers engaged in the decision-making process by capturing relevant activity and associating it with the correct opportunity. AI can then track all touch points with various opportunity contacts, analyse how they were engaged and who was engaged before and after them to show both the optimal number of buyers needed to close a deal and also how to sequence and communicate with these buyers in order to build a strategic coalition. This can only be done once AI has mapped the CRM data to the correct opportunity. We’re able to do this with our own persona-driven analysis as well.
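
The grouping step described above, mapping activities to an opportunity and surfacing the order in which coalition members were engaged, can be sketched as follows. This is a hypothetical simplification (field names and data shape are invented), not any vendor's actual pipeline.

```python
# Hypothetical sketch: associate raw CRM activities with their opportunity,
# then order contacts by first touch so the engagement sequence across the
# buying coalition becomes visible.
from collections import defaultdict

def engagement_sequence(activities):
    """activities: iterable of (opportunity_id, contact, timestamp).

    Returns opportunity_id -> list of contacts ordered by first engagement.
    """
    first_touch = defaultdict(dict)
    for opp, contact, ts in activities:
        if contact not in first_touch[opp] or ts < first_touch[opp][contact]:
            first_touch[opp][contact] = ts
    return {opp: sorted(c, key=c.get) for opp, c in first_touch.items()}
```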



Navigating the .NET Ecosystem

While it’s easy to get caught up in the past, and grumble over previous concerns and frustrations, we must move forward. Perhaps, arguably, one of the most logical paths forward is to unify .NET Core and .NET Framework ... dare I say, "Let’s make .NET great again!" Maybe I’ve gone too far, but let’s discuss the future. Where is Microsoft steering us? Let’s take a step back for a moment and discuss where we’ve come from, before diving into where we’re going. Not all .NET developers are aware of how their code compiles, and what it truly produces. "From the very beginning, .NET has relied on a just-in-time (JIT) compiler to translate Intermediate Language (IL) code to optimized machine code." — Richard Lander. Revisiting my earlier mention of the Mono project, we know there have been significant efforts around ahead-of-time (AOT) compilation for .NET; Mono has achieved this with its industry-leading LLVM compiler infrastructure.



Since last year, Google has been working with online developer education company Udacity to provide free lessons and has now packaged them as 'codelab courses that are formatted like tutorials'.  The courses are aimed at developers who have some experience in programming object-oriented, statically typed languages like Java or C# and who've used IDEs such as JetBrains' IntelliJ IDEA, Android Studio, Eclipse, or Microsoft's Visual Studio. Students will need to install the Java Development Kit (JDK) and IntelliJ. Google promotes Kotlin as a "concise" and "modern object-oriented language [that] offers a strong type system, type inference, null safety, properties, lambdas, extensions, coroutines, higher-order functions".  It started offering free courses last year via the Kotlin Bootcamp course and is now offering them in the Google Developers Codelabs format. "Google and Udacity currently offer video-based courses for Kotlin Bootcamp and How to build Android apps in Kotlin," said Jocelyn Becker, senior program manager of Google Developer Training.



Your competitive edge: Unlock new possibilities with Edge computing

Today more than ever, there is little tolerance from employees, clients, or consumers when it comes to IT failures - fortunately we are entering a new age of technologies that will minimize this risk. Software defined networks (SDN) are becoming increasingly adept at identifying specific customer needs and routing workloads to the locations that best serve specific cost, latency and security requirements. When combined with the potential of AI to inform decision making, we can see a perfect storm of capabilities that will deliver a step change in how computing and networks converge to deliver a personalized service to customers who embrace these technologies. The added capability of MEC provides an extra layer, protecting essential services from outages or connectivity issues stemming from rare, but damaging ISP failures or cloud server downtime. This approach to network architecture can also enable local hosting of data, allowing data privacy to be better governed.
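
The placement decision the paragraph describes, matching a workload to the location that best serves its cost, latency and security requirements, can be illustrated with a toy scoring function. The attributes, weights and location names below are invented for the sketch; a real SDN controller would of course use live telemetry and far richer policy.

```python
# Toy illustration: score candidate locations on cost, latency and security,
# with per-workload weights, and pick the best-scoring one.
def best_location(candidates, weights):
    """candidates: {name: {"cost": .., "latency_ms": .., "security": ..}}
    weights: per-factor importance. Lower cost/latency and higher
    security score better. Returns the best-scoring location name."""
    def score(attrs):
        return (weights["security"] * attrs["security"]
                - weights["cost"] * attrs["cost"]
                - weights["latency"] * attrs["latency_ms"])
    return max(candidates, key=lambda name: score(candidates[name]))
```

Shifting the weights models different workloads: a latency-sensitive job lands at the edge, while a cost-sensitive batch job lands in a cheaper region.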


New application security risks lead IT teams to DevSecOps


This type of organizational change doesn't come easily, but progress is possible, Ewert said. "When I started four years ago, there was a real 'us vs. them' mentality between developers and security," he said. "It took us three years to get the culture to the point where they approach us [for help]." That shift came about through training sessions and explanations for why certain security rules and practices are necessary, Ewert said. The SecOps team also found unobtrusive ways to dovetail its efforts with the rest of the organization's; they piggybacked on an IT monitoring project that rolled out standardized monitoring agents across the organization to get security monitoring agents installed, for example. It has also been helpful for Ewert's team to frame security recommendations in terms of how they can make the application and infrastructure more efficient, such as by avoiding costly distributed denial-of-service attacks. "We only block a release if it's absolutely critical."


Data Everywhere At the Edge of Compute & Energy Efficiency

In traditional or classic software development, or SW 1.0, a typical project for a software team may involve creating algorithms and writing millions of lines of instruction code. SW 2.0 is a new way of thinking in which value creation comes not from code writing, but from data curation. We collect data from our devices, select relevant data sets, verify and label them. We then use that “curated” data to train machine learning (ML) models. Imagine the data involved in running a large printing press. Curating data from a printing press might include engineers looking at images of final press output, finding defects (e.g., lines, splotches, roller marks, etc.), then training an ML model to detect those defects. The model “runs” in real-time at the press to monitor output. With SW 2.0, simple problems are quickly trained, and issues that might otherwise be impossible just take a bit longer.
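
The curate-then-train loop in the printing-press example can be sketched with a deliberately tiny stand-in for an ML model. The feature vectors and labels below are synthetic, and a real defect detector would be a trained vision model, but the shape is the same: label curated examples, fit something from them, classify new output.

```python
# Minimal SW 2.0 sketch: compute per-class centroids from labeled feature
# vectors ("curated" data), then classify new press output by nearest centroid.
def train_centroids(labeled):
    """labeled: iterable of (label, feature_vector). Returns label -> centroid."""
    sums, counts = {}, {}
    for label, vec in labeled:
        acc = sums.setdefault(label, [0.0] * len(vec))
        for i, v in enumerate(vec):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {lbl: [s / counts[lbl] for s in acc] for lbl, acc in sums.items()}

def classify(centroids, vec):
    """Return the label whose centroid is closest (squared distance)."""
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(c, vec))
    return min(centroids, key=lambda lbl: dist(centroids[lbl]))
```

Note where the engineering effort sits: almost entirely in producing good labeled examples, not in the handful of lines of "algorithm", which is the point of the SW 2.0 framing.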


Enterprises can hire software robots at the push of a button


The IT services roles are internal, providing services to IT teams, but most of the others in banking, insurance and retail are customer-facing, with bots being the first point of contact for customers making enquiries. Dube said about 112,000 customer calls a day are handled by IPsoft customer Telefonica in Peru using Amelia. “These calls are made when people have a problem with their account,” he said. “Humans have to be liberated from these chores.” Dube described the jobs being done by robots as “high-friction, low-margin roles”. They are jobs that have to be done, have high costs, but businesses do not make money out of them. Firms are replacing large numbers of staff in such roles with software robots. The World Economic Forum’s Future of Jobs 2018 report predicted that 75 million current jobs will be automated by 2022, and that 52% of jobs today will be done by robots by 2025.



Interview with Scott Hunter on .NET Core 3.0

Many times when we talk about .NET Core 3.0, we talk about the new desktop support, but there is also a lot of innovation in ASP.NET. First up, while we are not bringing back WCF, we know that many developers want to program high-performance, contract-based RPC services in their applications. We are embracing the open-source gRPC project for these workloads. We are working to make the .NET implementation first class, and because it is gRPC, it works with many other programming languages as well. There is a new microservices-related Worker Service project for building lightweight background workers, which can be run under orchestrators like Kubernetes. Also, while ASP.NET has great support for building APIs, we want to make it easy to put rich security on top of them, so we are adding bridges to use our APIs with the open-source Identity Server project. Finally, we are working on Blazor, which enables developers to build high-performance web applications using .NET in both the browser and the server, using WebAssembly.
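
The Worker Service pattern Hunter mentions is, at heart, a long-running loop that drains a work queue until it is asked to stop. The language-shifted sketch below (Python rather than C#, with invented names and a trivial stand-in for the real work) shows that shape; the .NET template wires the same loop into hosting, logging and graceful shutdown for you.

```python
# Background-worker sketch: process jobs from a queue until stop is signalled
# and the queue is drained, the loop a Worker Service project scaffolds.
import queue
import threading

def run_worker(jobs: "queue.Queue", stop: threading.Event, results: list) -> None:
    """Drain jobs, exiting only once stop is set and the queue is empty."""
    while not (stop.is_set() and jobs.empty()):
        try:
            job = jobs.get(timeout=0.1)
        except queue.Empty:
            continue
        results.append(job * 2)  # stand-in for the real unit of work
        jobs.task_done()
```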


Innovation: How to get your great ideas approved


Historical precedent suggests that a top-down approach to innovation can produce big benefits. Harvard Business Review (HBR) tracked the inventing history of 935 CEOs at publicly listed US high-tech companies and found that one in five of these successful firms has what it refers to as an inventor CEO: a boss who has been named on at least one patent. While the research is focused on the high-tech sector, HBR concludes that boards of directors should pay close attention to the inventor credentials of their executive teams. In the case of RBS, Hanley believes the senior team's role is to help point out opportunities and ensure any innovations help the business keep pace with a fast-changing finance market. "I guess our job is to not only think about the future, but almost work backwards from the future – we need to understand and have a point of view as to how we think the world is changing," says Hanley. "We need to articulate that view to all of our key stakeholders and then make sure we do something about it."


Cloud security: Weighing up the risk to enterprises

A good analogy is that clouds are like roads; they facilitate getting to your destination, which could be a network location, application or a development environment. No-one would enforce a single, rigid set of rules and regulations for all roads – many factors come into play, whether it’s volume of traffic, the likelihood of an accident, safety measures, or requirements for cameras. If all roads carried a 30 mile per hour limit, you might reduce fatal collisions, but motorways would cease to be efficient. Equally, if you applied a 70 mile per hour limit to a pedestrian precinct, unnecessary risks would be introduced. Context is very important – imperative in fact. The same goes for cloud computing. To assess cloud risk, it is vital that we define what cloud means. Cloud adoption continues to grow, and as it does, such an explicit delineation of cloud and on-premise will not be necessary. Is the world of commodity computing displacing traditional datacentre models to such an extent that soon all computing will be elastic, distributed and based on virtualisation? 



Quote for the day:


"Leadership does not always wear the harness of compromise." -- Woodrow Wilson


Daily Tech Digest - September 02, 2018

Strategies for Improving Smart City Logistics

Efficient, timely and accurate delivery is a necessity for retailers' and logistics providers' survival in an Amazon Prime world. Smart Cities' goals of livability and sustainability mean they want fewer trucks, less congestion and less pollution. For all stakeholders to achieve their goals, the only answer is to work together. If cities, retailers and logistics providers work together, collaboration and digital solutions can help resolve the traditional challenges of last-mile logistics and improve the livability and sustainability of cities. ... In Europe, where urbanization is higher, CO2-reduction goals are more aggressive, and the narrower streets of older cities are less equipped to handle a rise in urban freight transport, many initiatives and cities are working on this issue. The European Union has been co-funding and working collaboratively with cities and partners such as logistics companies like TNT and DHL, as well as local retailers, in the creation of consolidation centers and more sophisticated delivery practices.


Bank Products Are Dead: Long Live Experiences


By 2020 we’re going to see 50 billion new devices connected to the Internet — everything will be smart. Smart fridges that order your groceries or can tell you what you can cook with the remaining items inside, sensors you wear on your wrist or in your clothes that monitor your health and activity, cars that will talk to each other and drive themselves, smart mirrors that will show you how you look in that new shirt, robot drones and pods that will deliver your groceries or Amazon order — the world will be filled with smart stuff. We live in a world where new technology emerges and is adopted in months today, versus the years it took previously. It’s all moving so quickly. As more and more technology is injected into our lives, we become acclimatized and just accept the increased role technology has to play. This is known as technology adoption diffusion. As we move to this technology-optimized world, we’ll start to redesign where and how humans fit in society. Banking will be embedded in our life.


This mind-reading AI can see what you're thinking - and draw a picture of it

While headlines around the world have screamed out that AI can now read minds, the reality seems to be more prosaic. Computers are not yet able to anticipate what we think, feel or desire. As science writer Anjana Ahuja remarked in the Financial Times, rather than telepathy, “a more accurate, though less catchy, description would be a ‘reconstruction of visual field’ algorithm”. Most of the research so far has been aimed at deciphering images of what subjects are looking at or, in limited circumstances, what they are thinking about. Studies have previously focused on programs producing images based on shapes or letters they had been taught to recognize when viewed through subjects’ minds. However, in one recent piece of research, from Japan’s ATR Computational Neuroscience Laboratories and Kyoto University, scientists said that not only was a program able to decipher images it had been trained to recognize when people looked at them but: “our method successfully generalized the reconstruction to artificial shapes, indicating that our model indeed ‘reconstructs’ or ‘generates’ images from brain activity, not simply matches to exemplars.”


Microsoft officially christens 'Redstone 5' as the Windows 10 October 2018 Update

The October 2018 Update rollout will likely be staggered, as in past feature releases, with the new bits pushed first to machines known to be able to handle them best. Microsoft also will likely begin rolling out the server complements to the October 2018 Update -- Windows Server 1809 and Windows Server 2019 -- on the same day in October as the client build goes live. The part of today's announcement that is a bit more surprising is that Microsoft is still saying that the October 2018 Update will be going to the "nearly 700 million devices" running Windows 10. Microsoft has been using this same 700 million figure since March 2018 and hasn't provided an updated momentum figure. ... The Windows 10 October 2018 Update will include the Cloud Clipboard, dark-mode File Explorer option, a number of new Notepad features and other tweaks and updates. It also will deliver a number of new security and enterprise features, as well as a new Windows 10 Enterprise Remote Sessions edition. Microsoft will likely detail these enterprise features at its Ignite show.


Want To Survive & Thrive With AI?…Then Mind The Skills Gap

“The battle for diversity is vital, just from the perspective of finding the best talent in the widest possible pool. Demystifying the idea that AI is something very difficult is crucial, you do not need to code like Sergey Brin, the co-founder of Google. Being unafraid of a strange discipline is key. There is a huge gap between STEM and the arts and we need each other,” says Dr Lauterbach. ... “The phrase Artificial Intelligence is misleading because everything happens by human design. Human beings pick big data sets, algorithms, methodology and processing hardware.” According to Dr Lauterbach, if algorithms are not created to be inclusive, they could contribute to inequalities and thus would not be effective in helping the world. “AI has a capability to scale everything we are about as humans,” she says. “So if you have a team of only white male developers or only Chinese male developers, then you will get a data set or some algorithms that are wired according to the preferences, habits and thinking processes of those groups.”


The Modern Marketing Model for the Financial Industry


When we consider the new complexities of modern financial services marketing, it is best to integrate both traditional and digital marketing in a manner that achieves synergistic benefits. By fusing together both classical and digital marketing, organizations are in a better position to identify capability gaps, placing a focus on where and how to move forward. The chart below from eConsultancy helps to visualize the required components. This model is a natural progression from previous models used by marketers. For instance, in the 1960s, the prevalent marketing model was the ‘4Ps’ (Product, Price, Place and Promotion). In the 1980s, three additional Ps were added (People, Process and Physical) reflecting increased customer interaction and the beginning of targeting. In the 1990s, ROI entered the equation, as did the ongoing increase in importance of targeting (the ‘4Cs’ included Consumer, Cost, Communication and Convenience). The new marketing model highlights the importance of customer insight, analytics, brand and customer experience.


7 factors that will push implementation of AI in healthcare


Because artificial neural networks of deep learning mirror the brain’s ability to learn difficult patterns, Hinton noted that the networks also model complicated relationships between inputs and outputs, used for predicting future medical events from past events or large data sets. “As data sets get bigger and computers become more powerful, the results achieved by deep learning will get better, even with no improvement in the basic learning techniques, although these techniques are being improved,” Hinton wrote. A remaining challenge artificial intelligence has yet to overcome, Hinton wrote, is detecting patterns in unlabeled data in the process called “unsupervised learning.” “As new unsupervised learning algorithms are discovered, the data efficiency of deep learning will be greatly augmented in the years ahead, and its potential applications in healthcare and other fields will increase rapidly,” according to Hinton. Overall, clinicians and physicians should be aware of the challenges that come with implementing AI and deep learning into everyday workflow and know how to efficiently approach it.


Web-based cryptojacking
Taking as an example the 10 most profitable sites that host mining code, the researchers estimated that they are able to generate between 0.53 and 1.51 Monero per day, i.e., between 119 and 340 USD (at the time). While that’s not much, given that the revenue is achieved without any cost to the miner, it is still a notable profit. “However, we conclude that current cryptojacking is not as profitable as one might expect and the overall revenue is moderate,” the researchers noted. How to stop it? The researchers found that the existing blacklist-based approaches used by web browsers are trivial to evade and that the lists quickly become outdated. Instead of static blacklists, they leveraged a set of heuristic indicators for candidate selection and a dedicated performance-measurement step for precise miner identification. However suitable this approach is, they pointed out that it likely works well only because today’s mining operators don’t anticipate it. As the only reliable indicator of active mining is prolonged and excessive CPU usage, their advice for browser makers is to implement CPU allotments for tabs.
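The detection heuristic mentioned above (prolonged, excessive CPU usage) can be sketched as a sliding-window check. The threshold, window length and the `make_cpu_monitor` helper below are all invented for illustration, not taken from the research:

```python
from collections import deque

def make_cpu_monitor(threshold=0.8, window=30):
    """Flag a browser tab as a mining suspect only when its CPU usage
    stays above `threshold` for `window` consecutive samples."""
    samples = deque(maxlen=window)

    def observe(cpu_fraction):
        samples.append(cpu_fraction)
        # The window must be full AND every sample in it must be high.
        return len(samples) == window and min(samples) >= threshold

    return observe

observe = make_cpu_monitor(threshold=0.8, window=5)
readings = [0.9, 0.95, 0.92, 0.97, 0.91]  # sustained high CPU
flags = [observe(r) for r in readings]
```

A real browser would sample per-tab CPU from its process scheduler; the point is simply that a single spike never trips the flag, only sustained load does.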


Another sticking point the panel discussed was the issue of maturity. That is, organizations have to ask themselves whether they truly have the ability to define, develop and manage their AI investments in a way that will create value. After all, AI isn’t some piece of plug-and-play software you can just flip on and start using. There are significant process changes that need to occur, in technology systems and human employees alike. Security should also be of chief concern. AI’s impact on security can be profound, which means you must determine what controls and protections will be necessary from the very beginning to ensure your sensitive data (sources and outcomes) remains secure. When there’s confusion and disagreement over how to proceed, it can lead to a case of analysis paralysis. So before charging full steam ahead with AI, companies should realistically assess their own readiness to do so. Thankfully, the IPsoft AI Pioneers Forum is now working to develop a universal AI maturity model that may be helpful to companies in these cases.


Focusing on machine learning in 2020: augmentation instead of automation


The holy grail of augmentation can easily be seen as the pursuit of creativity, but there are many other areas of interest as well. Consider strategic decision making: where to build new skyscrapers, where to build new infrastructure (bridges, roads, facilities), which aircraft to buy to maximize profitability and growth, and which routes to fly, factoring in sustainability. These questions are still largely worked out with Excel sheets, BI tools and GIS systems, and maybe some legacy statistics software (SAS, SPSS) with custom analysis. While that may be sufficient for some industries, many of these problems have so many attributes that it’s impossible for us as humans to make optimal decisions — hence welcoming optimization and machine learning as augmenting features of decision making. And although it’s still quite early to tell, deep learning may well be of use here.
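As a toy illustration of the kind of optimization the author has in mind, the sketch below brute-forces a route-portfolio choice under a sustainability constraint. The route names, profit figures and CO2 numbers are entirely made up:

```python
# Hypothetical candidate routes: (name, profit, co2_tonnes)
routes = [
    ("AMS-JFK", 120, 9.0),
    ("AMS-DXB", 100, 7.5),
    ("AMS-NRT", 140, 11.0),
    ("AMS-CPT", 90, 8.0),
]

def best_portfolio(routes, max_routes=2, co2_budget=18.0):
    """Exhaustively pick the subset of routes that maximizes profit
    while staying within a CO2 budget -- a stand-in for the real
    optimization tooling that would augment a human planner."""
    best, best_profit = [], 0
    n = len(routes)
    for mask in range(1, 2 ** n):          # every non-empty subset
        chosen = [routes[i] for i in range(n) if mask >> i & 1]
        if len(chosen) > max_routes:
            continue
        profit = sum(r[1] for r in chosen)
        co2 = sum(r[2] for r in chosen)
        if co2 <= co2_budget and profit > best_profit:
            best, best_profit = chosen, profit
    return best, best_profit

best, profit = best_portfolio(routes, max_routes=2, co2_budget=18.0)
```

Real fleet planning has far too many attributes for brute force, which is exactly the article's point; solvers and learned models take over where enumeration stops scaling.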



Quote for the day:

"Becoming a leader is synonymous with becoming yourself. It is precisely that simple, and it is also that difficult." -- Warren G. Bennis

Daily Tech Digest - July 31, 2018

How disaster recovery can serve as a strategic tool

“You can count on us” is a popular business mantra. But what does that mean exactly? Consider this thought experiment: You and a competitor are hit with the same incident, but one of you gets back up more quickly. Fast recovery will give you a competitive advantage, if you can pay the price. “The smaller your RTO and RPO values are, the more your applications will cost to run,” says Google Cloud in a how-to discussion of DR. Any solution should also be well tested. “Your customers expect your systems to be online 24x7,” says Scott Woodgate, director, Microsoft Azure, in this press release. ... A solid DR plan can also facilitate transformational-based efficiencies. Let's say your leadership has business reasons for migrating to a new data center or transitioning to a hybrid cloud. Part of planning a migration is prepping for user experience and systems being down. If you are willing to use your DR assets during the transition, once the cloud or new physical sites are ready, you can fail back from DR, thus minimizing disruption. As an IT pro, you may not want to define these events as disasters, but business leaders prefer using existing resources to investing in swing gear.
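The RTO/RPO cost trade-off Google describes can be made concrete with a toy tiering function. The thresholds and tier names below are invented for illustration; real DR pricing depends entirely on the platform:

```python
def dr_tier(rto_minutes, rpo_minutes):
    """Map recovery objectives to a rough cost tier, illustrating the
    point that smaller RTO/RPO values cost more to run.
    Thresholds are hypothetical, not from any provider's pricing."""
    if rto_minutes <= 15 and rpo_minutes <= 5:
        return "hot standby (highest cost)"
    if rto_minutes <= 240 and rpo_minutes <= 60:
        return "warm standby (moderate cost)"
    return "cold backup/restore (lowest cost)"
```

An application that must be back in ten minutes with at most five minutes of lost data lands in the most expensive tier; a system that can tolerate a day of downtime can live on cheap cold backups.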



The cybersecurity incident response team: the new vital business team

We live and do business in a world fraught with cyber risks. Every day, companies and consumers are targeted with attacks of varying sophistication, and it has become increasingly apparent that everyone is considered fair game. Organisations of all sizes and industries are falling victim, and the cyber risk is quickly becoming one of the most prevalent threats. When disruptions do occur from cyberattacks or other data incidents they not only have a direct financial impact, but an ongoing effect on reputation. For example, Carphone Warehouse fell victim to a cyberattack in 2015, which resulted in the compromising of data belonging to more than three million customers and 1,000 employees. While it suffered financial losses from the remedial costs, which included a £400,000 fine from the Information Commissioner’s Office (ICO), it also led to consumers questioning whether their data was truly secure with the retailer and if it was simply safer to shop elsewhere. That loss in consumer confidence is incredibly difficult to claw back, particularly at a time when grievances can be aired on social media and be shared hundreds or thousands of times.


Managing IoT resources with access control

The first place to start in establishing an effective IoT security strategy is by ensuring that you are able to see and track every device on the network. Issues from patching to monitoring to quarantining all start with establishing visibility from the moment a device touches the network. Access control technologies need to be able to automatically recognize IoT devices, determine if they have been compromised and then provide controlled access based on factors such as the type of device, whether or not it is user-based and, if so, the role of the user. And they need to be able to do this at digital speeds. Another access control factor to consider is location. Access control devices need to be able to determine whether an IoT device is connecting remotely and, if not, where in the network it is logging in from. Different access may be required depending on whether a device is connecting remotely, or even from the lobby, a conference room, a secured lab or a warehouse facility. Location-based access policies are especially relevant for organizations with branch offices or an SD-WAN system in place.
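The factors listed above (compromise status, device type, user role, location) compose naturally into a policy function. This is a hypothetical sketch, not the API of any real network access control product:

```python
def grant_access(device):
    """Toy policy engine combining the access-control factors from the
    text. All policy values and network segment names are illustrative."""
    if device.get("compromised"):
        return "quarantine"                # compromised devices are isolated first
    if device.get("location") == "remote":
        return "restricted"                # remote connections get limited segments
    if device.get("type") == "sensor":
        return "iot-vlan"                  # headless IoT devices are segmented by default
    if device.get("role") == "admin" and device.get("location") == "secured-lab":
        return "full"                      # user-based access depends on role AND location
    return "standard"
```

In practice these decisions must be made automatically "at digital speeds" as devices join; the sketch only shows how the factors stack in priority order.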


Artificial intelligence: Why a digital base is critical

The adoption of AI, we found, is part of a continuum, the latest stage of investment beyond core and advanced digital technologies. To understand the relationship between a company’s digital capabilities and its ability to deploy the new tools, we looked at the specific technologies at the heart of AI. Our model tested the extent to which underlying clusters of core digital technologies (cloud computing, mobile, and the web) and of more advanced technologies (big data and advanced analytics) affected the likelihood that a company would adopt AI. As Exhibit 1 shows, companies with a strong base in these core areas were statistically more likely to have adopted each of the AI tools—about 30 percent more likely when the two clusters of technologies are combined. These companies presumably were better able to integrate AI with existing digital technologies, and that gave them a head start. This result is in keeping with what we have learned from our survey work. Seventy-five percent of the companies that adopted AI depended on knowledge gained from applying and mastering existing digital capabilities to do so.


The 5 Clustering Algorithms Data Scientists Need to Know

Clustering is a Machine Learning technique that involves the grouping of data points. Given a set of data points, we can use a clustering algorithm to classify each data point into a specific group. In theory, data points that are in the same group should have similar properties and/or features, while data points in different groups should have highly dissimilar properties and/or features. Clustering is a method of unsupervised learning and is a common technique for statistical data analysis used in many fields. In Data Science, we can use clustering analysis to gain some valuable insights from our data by seeing what groups the data points fall into when we apply a clustering algorithm. Today, we’re going to look at 5 popular clustering algorithms that data scientists need to know, along with their pros and cons! K-Means is probably the best-known clustering algorithm. It’s taught in a lot of introductory data science and machine learning classes. It’s easy to understand and implement in code! Check out the graphic below for an illustration.
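The K-Means loop can be sketched in a few lines of pure Python: assign each point to its nearest centroid, then move each centroid to the mean of its members, and repeat. The sample points below are made up to form two obvious groups:

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Minimal 2-D K-Means: repeat (assign to nearest centroid,
    recompute centroids) for a fixed number of iterations."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)          # naive init: k random points
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: (p[0] - centroids[c][0]) ** 2
                                + (p[1] - centroids[c][1]) ** 2)
            clusters[i].append(p)
        centroids = [
            (sum(p[0] for p in cl) / len(cl), sum(p[1] for p in cl) / len(cl))
            if cl else centroids[i]            # keep old centroid if cluster empties
            for i, cl in enumerate(clusters)
        ]
    return centroids, clusters

points = [(0, 0), (1, 0), (0, 1), (10, 10), (11, 10), (10, 11)]
centroids, clusters = kmeans(points, k=2)
```

The "cons" the article alludes to show up even here: you must pick k in advance, and the naive random initialization can converge to poor local optima on less well-separated data.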



Ransomware Attack Leads to Discovery of Lots More Malware

The investigation concluded the unauthorized persons would have had the ability to access all of the Blue Springs computer systems, the clinic notes. "However, at this time, we have not received any indication that the information has been used by an unauthorized individual." The U.S. Department of Health and Human Services' HIPAA Breach Reporting Tool website, or "wall of shame," indicates that Blue Springs on July 10 reported the breach as a hacking/IT incident involving its electronic medical records and network server that exposed data on nearly 45,000 individuals. Blue Springs' front desk receptionist, who did not want to be identified by name, told Information Security Media Group on Friday that the investigation into the ransomware attack had not yet determined the source of the attack, the source of the other malware discovered, whether the other malware might have been present on the practice's systems before the ransomware attack, or whether the infections were all part of the same attack. She said the practice chose to "rebuild" its systems and did not pay a ransom.


CIOs reveal their security philosophies

“Overly strict security creates a different risk — throttling information exchange and creativity can threaten a company’s competitive viability,” Johnson adds. “Poorly managed reactions to breaches — and all firms have been breached in some way — can lead to other business deterioration.” “Security is as much a human challenge as it is a technical challenge,” he concludes. “Dependable cybersecurity requires a three-part strategy of 1) superb technical implementation of the basics, 2) consistent education aimed at increasing awareness of employees, vendors, and executives, and 3) building a security team that is as motivated, skilled, and innovative as the bad guys.” In this edition of Transformation Nation, CIOs delineate their own IT security philosophies — dispatches from the front lines of cybersecurity strategy. The implications of a breach for corporate reputation, economic well-being, and personal security are immense. Through these accounts, CIOs reveal the many tension points in application and communication that they grapple with every day.


GDPR means it is time to revisit your email marketing strategies

No matter how private you think your emails are, every email you send and receive is stored on a remote hard drive you have no control over. If your email provider doesn’t encrypt your emails end to end (most don’t), all company emails are at risk. Encrypting employee email communications plays a huge role in maintaining GDPR compliance. The average employee won’t think twice about emailing co-workers about sensitive issues that may include data from the business database. For example, someone might send a customer’s credit card information to the sales department for processing a return. To protect your internal emails and maintain GDPR compliance, buying general encryption services isn’t enough. You need to know exactly how and when the data is and isn’t being encrypted. Not all encryption services are complete. For instance, if you’re using Microsoft 365, you’ve probably heard of a data protection product called Azure RMS. This product uses TLS security to encrypt email messages the moment they leave a user’s device. Unfortunately, when the messages reach Microsoft’s servers, they are stored unprotected.
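The transport-only vs. end-to-end distinction can be modeled with a deliberately insecure toy XOR "cipher" (never use XOR for real mail; it only shows where the keys live). With transport encryption, the server holds the session key and so can store plaintext; with end-to-end encryption, only sender and recipient hold the key:

```python
import hashlib

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy stand-in for real encryption: XOR with a repeating key.
    Applying it twice with the same key recovers the original."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

plaintext = b"card 4111..."

# Transport-only model (like TLS): the server terminates the session,
# decrypts on arrival, and stores the message in the clear.
session_key = b"tls-session"
wire = xor_cipher(plaintext, session_key)        # protected in transit
stored_tls_only = xor_cipher(wire, session_key)  # server sees plaintext

# End-to-end model: only sender and recipient derive the key,
# so the server stores ciphertext it cannot read.
e2e_key = hashlib.sha256(b"shared secret").digest()
stored_e2e = xor_cipher(plaintext, e2e_key)
```

This is exactly the gap described for TLS-only products: the message is protected between hops but sits readable at rest on the provider's servers.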


Google, Cisco amp-up enterprise cloud integration

The Cisco/Google combination – which is currently being tested by an early access enterprise customer, according to Google – will let IT managers and application developers use Cisco tools to manage their on-premises environments and link them to Google’s public IaaS cloud, which offers orchestration, security and ties to a vast developer community. In fact, the developer community is one area the companies have targeted recently by announcing a Cisco & Google Cloud Challenge, which is offering prizes worth over $160,000 to develop what Cisco calls “game-changing” apps using Cisco’s Container Platform with Google Cloud services. Cisco says the goal is to bring together its DevNet community and Google’s Technology Partners to bring new hybrid-cloud applications to enterprise customers. Cisco VP & CTO of DevNet Susie Wee wrote in a blog that, in preparation for the Challenge, DevNet is offering workshops, office hours, and sandboxes using Cisco Container Platform with Google Cloud services to help customers and developers learn how to connect cloud data from a private cloud, or even data from edge devices, to the Google Cloud Platform to run analytics and employ machine learning.


Why 'Sophisticated' Leadership Matters -- Especially Now


When challenged by complexity, many leaders try to implement best practices such as lean management, restructuring or re-engineering. Such investments may indeed be necessary, but they are rarely sufficient. This is because the root cause of most stalls is that the leader has run up against the limits of his or her leadership sophistication. In other words, the leader is failing to reinvent him- or herself as the new kind of leader the organization now needs. This usually means that the leader doesn’t fully appreciate that intelligence, hard work and technical knowledge must now take a back seat to enhanced personal, interpersonal, political and strategic leadership capabilities. In other words, you will stall not because the complex challenges you face require changes in your organization. But rather because the sophisticated challenges require change in yourself. So how can you become a more sophisticated leader? Try pulling back, elevating your viewpoint and figuring out how you can take yourself to the next level.



Quote for the day:



"Next generation leaders are those who would rather challenge what needs to change and pay the price than remain silent and die on the inside." -- Andy Stanley


Daily Tech Digest - April 22, 2018

New Fraud Statistics Show Rising Volume of Identity Theft

The Cifas data indicated that online retail fraud rose 49 percent last year. According to the report, identity fraud “remains a predominantly internet-based offense, with 84 percent of identity fraud occurring through online application channels.” Account takeover (ATO) fraud is also on the rise, experiencing a 7 percent increase over 2016. A recent Javelin report found that ATO fraud tripled last year, causing more than $5 billion in losses. In addition, the average resolution time for ATO was 16 hours. New account fraud (NAF), meanwhile, rose 70 percent as cybercriminals leveraged personally identifiable information (PII) to create fake credit card and bank accounts. The Cifas report also noted that actors are increasingly targeting older age groups for ATO fraud using social engineering techniques. These often take the form of phishing emails or over-the-phone “security checks” that ask victims to provide personal information for “verification.” Once attackers have PII in hand, they’re able to either compromise existing accounts or create new ones that may lead to claims of credit fraud or identity theft.



'WordPress of Blockchain' Startup Seeks to Solve Enterprise Pain Points

The Federated Network Protocol is aware of the number of validators, and their health, at all times. This awareness allows Hadron to predict the point of failure on the network and prevent it by spinning up temporary validators that keep the network alive while participants are alerted to the imbalance and instructed to remedy it. In this way, Dukkipatty said, the blockchains that use Elemential (which has designed its middleware for Hyperledger Fabric, Corda, Tendermint and private instances of ethereum) can continue working even when a problem arises. Currently, Elemential is working with the National Stock Exchange of India on a know-your-customer (KYC ) compliance scheme that's built on a private blockchain. The pilot includes ICICI Bank, IDFC Bank, Kotak Mahindra Bank, IndusInd Bank and RBL Bank, as well as HDFC Securities, a Mumbai-based brokerage. While the system allows nodes on the same networks to communicate with each other, Elemential's aspirations go further than that.


The truth about data

There are many things that impact the quality and veracity of data throughout its life cycle. Errors can be introduced in the collection process, as it is cleaned or moved across disparate systems. It may have been gathered for a different purpose than what it is now being used for. Or it can simply be too old. When United Airlines recently looked at the data it was using to predict seating demands, the company discovered it was actually data from forecasts that were decades old. This lack of veracity resulted in inaccurate pricing models that cost United Airlines $1 billion (£700 million) per annum in missed revenue. It is therefore both surprising and alarming to discover that while 79pc of executives agree that their organisations are basing their most critical systems and strategies on data, many have not invested in the capabilities to verify the truth within it. Without establishing the veracity of that data, businesses leave themselves vulnerable and open to a threat that is critically overlooked.
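A freshness check like the one the United Airlines example called for is cheap to automate. The sketch below flags records older than a configurable cutoff; the field names and threshold are illustrative:

```python
from datetime import datetime, timedelta

def stale_records(records, max_age_days=365, now=None):
    """Return the ids of records whose timestamp exceeds a freshness
    threshold -- one of the cheapest veracity checks available."""
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=max_age_days)
    return [r["id"] for r in records if r["as_of"] < cutoff]

now = datetime(2018, 4, 22)
records = [
    {"id": "fresh",       "as_of": datetime(2018, 1, 10)},
    {"id": "decades-old", "as_of": datetime(1995, 6, 1)},
]
flagged = stale_records(records, max_age_days=365, now=now)
```

Age is only one axis of veracity (provenance and fitness-for-purpose matter too), but even this trivial gate would have surfaced decades-old forecast data before it reached a pricing model.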


How DataOps Is Transforming Data Management Practices

Data should be a shared asset, but many companies struggle to treat it as such. Data transcends traditional organizational structures and lines of business, and managers find it difficult to reconcile its governance against traditional business structures. It is not uncommon for data management projects to digress into organizational turf battles. This lack of sharing can result in many different versions of reality, where managers compete to promote their own. When data users don’t trust the data or each other, it’s hard to unlock value. Emerging technology providers think that they’ve found a path forward for building trust through a discipline called Data Operations, or “DataOps.” TAMR’s Palmer has been a pioneer in the field of DataOps, which he describes as “the framework of tools and culture that allow data engineering organizations to deliver rapid, comprehensive and curated data to their users”. He continues, “DataOps enable users to help curate and correct data when they consume it by providing feedback from the point of consumption”.


The biggest challenges for true modernization in 2018

"It's a great opportunity to have the top cover from the administration and the funding, hopefully, to get this done," one executive said. "But I see another opportunity in my organization to change some things. I'm looking at a culture shift and a kind of mind shift on how we do business. I want to be more adaptable, have more agility and be able to focus on cyber and data, and the only way to do those activities effectively is to change the skill set in-house. We also need to have a new strategy for managing data because I'm looking at things like deep learning and artificial intelligence." Other participants said they, too, are taking advantage of the opportunity to consider dramatic changes. "Our agency had eight CIOs in 10 years — and a year and a half without a CIO," one executive said. "It was constant turmoil. Staffing, hiring, rewarding, contracts — everything was broken. So we decided to blow it all up and start over. And we tell everybody to steal from anybody who's done this already. Let's not reinvent it if you don't have to."


AI In Marketing: Where And When It Can Make A Difference

Today’s CMO is tasked with the challenge of understanding a far greater number of channels, platforms and technologies than ever before. Couple that with the never-ending flow of data coming from every device, method and channel and it’s a recipe for data-processing disaster. The right investment can determine whether a CMO lasts beyond the average 18-month tenure. Artificial intelligence offers fascinating possibilities for marketing. While it’s still in its infancy, the power is in the hands of marketers to push for answers to the hard questions. Marketers looking to invest in new technologies must know how and why they’re going to apply them and evaluate how they will solve specific pain points. By working with teams made up of traditional marketers, who focus on the practical applications or technical investment, and more technically savvy computer scientists, who will be responsible for building out and deploying new solutions, CMOs can make far more informed decisions.


Tapping Into Data Capital with AI and Machine Learning


The enterprise data being leveraged includes a complete history of all candidates selected and hired, their key attributes, how they were on-boarded once hired, and their eventual performance in the organization. An analysis engine extracts key features that contributed to candidates’ success and creates a recommendation engine that can rate new applicants along their likelihood to thrive at the organization. Simple data analytics, right? Yes, except that the algorithms, rather than people, decide which factors matter and which do not. Furthermore, the system continually processes ongoing results of those candidates, updating its recommendation engine rules over time. The system learns from actual experience, just like humans do. But it does so far more rapidly and objectively. “Now, extend this capability to other high-value, high-frequency business processes,” Hollis writes. “Timing and pricing of supply chain purchasing. Negotiating discounts on large orders. Measuring the temperature of your customers to determine when a small issue might become a big one. Today’s AI-informed recommendations become tomorrow’s advanced automation.”
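A radically simplified version of such a recommendation engine can be built from attribute-level success rates. Unlike the system described, this toy does no real feature extraction or continual retraining; it only shows the learn-then-score shape, and all the attribute names are invented:

```python
from collections import defaultdict

def train(history):
    """Learn, per attribute value, the fraction of past hires who
    thrived. `history` is a list of (attributes, thrived) pairs."""
    counts = defaultdict(lambda: [0, 0])   # value -> [thrived, total]
    for attrs, thrived in history:
        for v in attrs:
            counts[v][1] += 1
            counts[v][0] += int(thrived)
    return {v: t / n for v, (t, n) in counts.items()}

def score(rates, attrs):
    """Rate a new applicant as the mean success rate of their known
    attributes; unknown applicants default to a neutral 0.5."""
    known = [rates[v] for v in attrs if v in rates]
    return sum(known) / len(known) if known else 0.5

history = [
    (("referral", "bootcamp"), True),
    (("job-board", "bootcamp"), False),
    (("referral", "degree"), True),
]
rates = train(history)
```

The "learning from experience" the passage describes corresponds to re-running `train` as new outcome data arrives, so the rates (and hence the recommendations) drift with actual results rather than with anyone's intuition.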


Confused about mobile platforms? You’re not alone. Here’s clarity.

The very thin thread of evidence for a dual boot into Windows is a reference in the same commit to an internal Google document called “go/vboot-windows.” Trouble is, Google offering Windows on Pixelbooks doesn’t make sense. Google hardware exists to support Google software and services. What makes a little more sense is Fuchsia OS as “Alt OS.” (More on Fuchsia below.) It’s also possible that Google wants to enable enterprises, schools and developers to more easily dual-boot whatever OS they want to tinker with as a way to encourage such customers to try Chrome OS. A number of experimental alternative OS projects are being worked on in the Linux community. They include GalliumOS, which is based on Xubuntu and is designed for Chrome OS devices specifically. However, GalliumOS itself contains a script that enables users to dual-boot Chrome OS and GalliumOS. So the answer to the question of whether Chromebooks will run Windows is: Maybe, but probably not.


Moving your data analytics to the cloud isn’t so easy

Moving the data doesn’t magically solve your integration challenges. Also, systems of record may still remain on premises, and so need to be synced with the data now stored in the cloud in a timely manner to get up-to-date results. This means using a mix of old and new data-integration technologies and setting up processes that include data movement and structure transformation. Finally, the cloud-based analytics databases themselves are complex and difficult to configure. Some of that complexity is due to the security subsystems in the database; these are necessary but must be figured out in the context of the database and data analytics. This security must also be systemic with the rest of the systems the data analytics systems touch, both in the cloud and on premises—and that can mean most of the other operational systems that need to feed analytics in real time. Although these cloud analytics challenges can all be overcome, it’s up to IT to understand the level of effort may actually be an 8 out of 10, when it thought (or more likely was told) that it would be a 5 out of 10.


Overcoming hidden data risks when managing third parties

Third party risk management is becoming increasingly top-of-mind for organizations as they attempt to protect their privacy and confidential data and improve their security and risk exposure as part of the overall health of their organization. High-profile breaches, like the one suffered by Target in 2014 or more recently by Netflix in 2017, continue to bring to the forefront the risks third parties can introduce to an organization. As the cloud has increasingly become mainstream, an entirely new set of external risks has been introduced to our environment. Most organizations today rely on several—if not dozens—of external/SaaS applications to run their business, not to mention cloud-based infrastructure and platform offerings. Data ranging from employee vacation time to business documentation to confidential customer information now resides in the cloud, creating a new frontier of risk with which organizations must now contend. For many, the ability to manage this new frontier has not kept pace with the adoption of new, cost-effective technologies to better enable operations.



Quote for the day:


"Program testing can be used to show the presence of bugs, but never to show their absence!" -- Edsger W. Dijkstra


Daily Tech Digest - December 23, 2017

What Metrics Should You Evaluate When Looking at Hyperconverged Infrastructure?


When it comes to hyperconverged infrastructure, some in the IT industry view its merits through the storage lens. This seems logical because hyperconverged technology offers many benefits in how we provision, consolidate, and manage storage. But the metrics those select few look at are too focused on storage-specific features, such as the number of nodes or terabytes, rather than the VM-related measurements commonly used for other software-defined infrastructures such as the cloud. Since hyperconverged infrastructure shifts the paradigm from managing infrastructure components to managing VMs, there should also be a shift in the metrics used to measure it. But with bias present among the vendors, how will customers find the true hyperconverged metrics that matter?



5 Sectors Blockchain Is Disrupting That Are Not Cryptocurrency

For a few years now, "blockchain" and "cryptocurrency" have gone hand-in-hand. The blockchain concept is complicated, and involves constantly growing record lists linked together and secured through cryptography (think of the Cryptex from The Da Vinci Code). Each block of the chain contains a hash pointer to the previous block, as well as transaction data and a timestamp. The idea of a blockchain isn't relegated to the infant-era cryptocurrency revolution. Massive worldwide corporations are beginning to incorporate blockchain technology into their systems. The technology behind the blockchain is far more valuable on a global scale than any market capitalization of cryptocurrencies. Here are five large sectors currently being disrupted by the potential of this technology.
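The hash-pointer structure described above is easy to demonstrate: each block's hash covers its data, timestamp and the previous block's hash, so tampering anywhere breaks validation. A minimal sketch (block layout and field names are illustrative):

```python
import hashlib
import json
import time

def block_hash(block):
    """Hash the block's contents (data, previous-block pointer, timestamp)
    with a canonical JSON encoding so the digest is reproducible."""
    payload = {k: block[k] for k in ("data", "prev", "ts")}
    return hashlib.sha256(
        json.dumps(payload, sort_keys=True).encode()
    ).hexdigest()

def make_block(data, prev_hash):
    block = {"data": data, "prev": prev_hash, "ts": time.time()}
    block["hash"] = block_hash(block)
    return block

def valid_chain(chain):
    """A chain is valid when every block's stored hash matches its
    contents and every block points at its predecessor's hash."""
    hashes_ok = all(block_hash(b) == b["hash"] for b in chain)
    links_ok = all(chain[i]["prev"] == chain[i - 1]["hash"]
                   for i in range(1, len(chain)))
    return hashes_ok and links_ok

genesis = make_block("genesis", "0" * 64)
chain = [genesis, make_block("tx: A pays B 5", genesis["hash"])]
```

Changing any block's data changes its hash, which in turn invalidates every later block's `prev` pointer; that cascading breakage is what makes the record list tamper-evident.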


Europe Unveils Its Vision for a Quantum Future

The commission clearly expects large-scale quantum processing using one or more of these technologies within five to 10 years. Whether this will be done in Europe first is much less clear. Quantum simulation is the third area of investment. Simulating complex quantum properties on an ordinary computer is close to impossible. But quantum systems can be made to simulate aspects of other quantum systems more or less perfectly. Physicists are toying with various ways of doing this. The basic idea is to find a quantum system that is well understood, and easy to manipulate and measure, and then use that to simulate a system that is hard to manipulate and measure. The well-understood systems include ultra-cold atoms and molecules, ions trapped in magnetic fields, and superconducting circuits.


Events, Flows and Long-Running Services: A Modern Approach to Workflow Automation


The idea is backed by the Domain-Driven Design (DDD) community, by providing the nuts and bolts for leveraging domain events and by showing how they change the way we think about systems. Although we are generally supportive of event orientation, we asked ourselves what risks arise if we use them without further reflection. To answer this question we reviewed three common hypotheses:
Events decrease coupling; central control needs to be avoided; and workflow engines are painful. ... A more sensible approach to tackling this flow is to implement it in a dedicated service. This service can act as a coordinator and send commands to the others -- for example, to initiate the payment. This is often a more natural approach, as in this case we would generally not consider it good design if the Payment service had knowledge of all of its consumers by subscribing to the many business events that trigger payment retrieval.
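The dedicated coordinator service described can be sketched as a function that issues commands to service clients and compensates when a later step fails. The service names, command set and stub class are all hypothetical:

```python
class StubService:
    """Hypothetical service client; records the commands it receives
    so the coordinator's behavior can be inspected."""
    def __init__(self, ok=True):
        self.ok, self.commands = ok, []

    def _cmd(self, name):
        self.commands.append(name)
        return self.ok

    def charge(self):   return self._cmd("charge")
    def refund(self):   return self._cmd("refund")
    def reserve(self):  return self._cmd("reserve")
    def dispatch(self): return self._cmd("dispatch")

def order_flow(payment, inventory, shipping):
    """The coordinator owns the sequence: it commands each service in
    turn and issues a compensating command when a later step fails.
    No service needs to know who its consumers are."""
    if not payment.charge():
        return "payment-failed"
    if not inventory.reserve():
        payment.refund()   # compensating command
        return "refunded"
    shipping.dispatch()
    return "dispatched"
```

Contrast this with the event-choreographed version the authors warn about, where Payment would have to subscribe to every business event that should trigger it.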


Here's What Two Millennial Blockchain Founders Have To Say About Cryptocurrency

With so many different reports, it can be hard to make sense of the cryptocurrency landscape. One thing’s for certain—Bitcoin is just the tip of the iceberg. There are so many promising blockchain projects sprouting up with millennials leading the way. From 24-year-old Vitalik Buterin, who founded Ethereum, now the world’s second largest cryptocurrency, to 26-year-old Justin Sun, who seeks to reinvent how digital creators get paid for their online content with TRON. ... “We don’t think you even need to hold dollars or pounds in the future. We think people will literally be spending with their Mona Lisa tokens or with their gold or with their Apple stock, only what they want to hold, not what they think they need to hold simply because it’s the only thing that’s accepted. People will literally be able to walk into McDonalds and pay with their Mona Lisa tokens and that’s why we created this company,” Gelderman says.


Can RegTech Really Save Banks Billions Each Year?


The global investment banking industry is worth a few hundred billion dollars annually, as are both the audit and legal professions. And since the last decade or so, increased regulation has forced banks to devote around 10% of their salary costs to employing an army of compliance controllers to ensure that their transactions and processes meet the standards required by the law. And the stakes are high. Rogue traders, breaches of confidentiality, and reckless financial positions can expose financial institutions to fines, cripplingly negative publicity, and even prison sentences, not to mention huge financial losses. These stakes are what make banks the earliest adopters of many technological innovations. Banks are turning to Regulatory Technology (RegTech), chiefly Artificial Intelligence (AI) and Augmented Intelligence (IA) but also other developments in computing like blockchain


While Bitcoin Price Soars, Technological Advancements Continue in the Background

As such, it helps to assimilate any new or additional information in context, making sense of it by comparison to other experiences. For example, imagine your buddy invites you to "catch some waves" and, to your surprise, after two hours on the road you finally pull up to an indoor resort water park with one of those cool new "wave pools": the waves are generated mechanically and are meant to impress, but not utterly frighten, well-meaning vacationers. This is not the same as a trip to the beach, right? The same can be said of traditional investment vehicles versus cryptocurrencies and assets. Some key interactions with each are very familiar; however, the context of operating within a purely virtual universe, where the data is publicly distributed and the infrastructure is community-owned, is very important to how you choose to engage.


Our top 7 cyber security predictions for 2018

The Equifax and Anthem breaches were wake-up calls for many consumers, who are now asking questions about the safety of their online accounts. Most still have no idea about password alternatives or enhancements like multi-factor authentication (MFA) or risk-based authentication, but they are more aware that passwords alone are no longer enough. In fact, research done by Bitdefender shows that U.S. citizens are more concerned about stolen identities (79 percent) than email hacking (70 percent) or home break-ins (63 percent). This is important, because companies often cite a lack of demand for stronger authentication as a reason for not offering it. ... State-sponsored attacks might also spur countries to form alliances to fight them. "Increased attacks on critical infrastructure will drive countries to begin discussing cybersecurity alliances. Establishing these alliances will provide mutual defense for all countries involved and it will allow for the sharing of intelligence in the face of attributed nation-state attacks, not to mention agreements to not attack each other," says Eddie Habibi, CEO of PAS Global.


Agile for Marketing and Communication

Agile brings movement, flexibility, and connection, and ensures that the right people are involved in communication. It also gives communication professionals tools to keep a grip on how communication develops and how resources are used across internal stakeholders. This way you can cope better with change and stay more in control of the project schedule and its state. It also produces self-organizing teams that take their own responsibility and add value to the product being delivered. It therefore helps to finish assignments in a short period of time, by focusing and making choices up front. During the preparation of the event RIVM Kennisparade, for example, I intervened in the progress only once, when asked by the product owner. Because we directly involved users, stakeholders, and the other necessary organizational disciplines in the process, we ensured support throughout the organization of the event. And that is a very good way to add value to our products.


The internet is broken

The internet was built on decades-old technology. Today, the internet comprises billions of devices, every one of which is more powerful than those upon which the internet and the web were built. Storage is exponentially cheaper and wireless technologies mean that countries are developing web infrastructures that aren't built on undersea cables. Our phones can scan our fingerprints and faces, making payments secure. Emerging technologies such as the blockchain enable experiments in new models for file sharing and value exchange. So let's consider a thought experiment: if we were to reset the internet - shut everything down and start again, using 30 or so years of experience - would it still look the same? Or would we design something different… even better?



Quote for the day:


"The sign of a beautiful person is that they always see beauty in others." -- Omar Suleiman