Daily Tech Digest - November 30, 2022

7 lies IT leaders should never tell

Things break, and in most cases, it comes as a surprise. IT consists of many systems requiring different degrees of connectivity and monitoring, making it difficult to know absolutely everything at every moment. The key to minimizing failures is to be proactive rather than simply waiting for bad things to happen. CIOs should not only expect things to break but also be honest about this with their team members and business colleagues. “Eat, sleep, and live that life,” advises Andre Preoteasa, internal IT director at IT business management firm Electric. “There are things you know, things you don’t know, and things you don’t know you don’t know,” he observes. “Write down the first two, then think endlessly about the last one — it will make you more prepared for the unknowns when they happen.” Preoteasa stresses the importance of building and maintaining detailed disaster recovery and business continuity plans. “IT leaders that don’t have [such plans] put the company in a bad position,” he notes. “The exercise alone of writing things down shows you’re thinking about the future.”


Amid Legal Fallout, Cyber Insurers Redefine State-Sponsored Attacks as Act of War

Acts of war are a common insurance exclusion. Traditionally, exclusions required a "hot war," such as what we see in Ukraine today. However, courts are starting to recognize cyberattacks as potential acts of war without a declaration of war or the use of land troops or aircraft. The state-sponsored attack itself constitutes a war footing, the carriers maintain. ... Effectively, Forrester's Valente notes, larger enterprises might have to set aside large stores of cash in case they are hit with a state-sponsored attack. Should insurance carriers be successful in asserting in court that a state-sponsored attack is, by definition, an act of war, no company will have coverage unless they negotiate that into the contract specifically to eliminate the exclusion. When buying cyber insurance, "it is worth having a detailed conversation with the broker to compare so-called 'war exclusions' and determine whether there are carriers offering more favorable terms," says Scott Godes, partner and co-chair of the Insurance Recovery and Counseling Practice and the Data Security & Privacy practice at District of Columbia law firm Barnes & Thornburg.


Top 5 challenges of implementing industrial IoT

Scalability is another challenge faced by professionals trying to make progress with their IIoT implementations. Bain’s 2022 study of IIoT decision-makers indicated that 80% of those who purchase IIoT technology scale fewer than 60% of their planned projects. The top three reasons why those respondents failed to scale their projects were that the integration effort was overly complicated and required too much effort, the associated vendors could not support scaling, and the life cycle support for the project was too expensive or not credible. One of the study’s takeaways was that hardware could help close gaps that prevent company decision-makers from scaling. Another best practice is for people to take a long-term viewpoint with any IIoT project. Some people may only think about what it will take to implement an initial proof of concept. That’s just a starting point. They’ll have to look beyond the early efforts if they want to eventually scale the project, but many of the things learned during the starting phase of a project can be beneficial to know during later stages.


AWS And Blockchain

The customer CIO, an extremely smart person, spoke up, in beautifully-rounded European vowels: “Here’s a use case I’ve been told about that’s on my mind.” He named a region in Asia and explained that the small farmers there mark their landholdings carefully, but then the annual floods sometimes wash the markers away. Then unscrupulous larger landowners use the absence of markers to cut away at the smallholdings of the poorest. “But if the boundary markers were on the blockchain,” he said, “they wouldn’t be able to do that, would they?” ... I thought. Then said “As a lifelong technologist, I’ve always been dubious about technology as a solution to a political problem. It seems a good idea to have a land-registry database but, blockchain or no, I wonder if the large landowners might be able to find another way to fiddle the records and still steal the land? Perhaps this is more about power than boundary markers?” Later in the ensuing discussion I cautiously offered something like the following, locking eyes with the CIO: “There are many among Amazon’s senior engineers who think blockchain is a solution looking for a problem.” He went entirely expressionless and the discussion moved on.

The key message is that before data is persisted into the storage layers (Bronze, Silver, Gold), it must pass data quality checks, and corrupted records that fail those checks must be dealt with separately before anything is written to the storage layer. ... The “Bronze => Silver => Gold” pattern is a type of data flow design, also called a medallion architecture. A medallion architecture is designed to incrementally and progressively improve the structure and quality of data as it flows through each layer of the architecture. This is why it is relevant for today’s article regarding data quality and reliability. ... Generally, the data quality requirements become more and more stringent as the data flows from raw to bronze to silver and to gold, as the gold layer directly serves the business. You should, by now, have a high-level understanding of what a medallion data design pattern is and why it is relevant for a data quality discussion.
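
To make the quality gate concrete, here is a minimal sketch of a bronze-to-silver promotion written with pandas; the column names, rules, and file names are hypothetical, and a real pipeline would use whatever engine and rule set the platform provides.

```python
import pandas as pd

# Hypothetical quality rules for promoting bronze data to silver:
# every record needs a non-null id and a non-negative amount.
def passes_quality_checks(df: pd.DataFrame) -> pd.Series:
    return df["id"].notna() & (df["amount"] >= 0)

bronze = pd.DataFrame({"id": [1, 2, None, 4],
                       "amount": [10.0, -5.0, 7.5, 3.2]})

ok = passes_quality_checks(bronze)
silver = bronze[ok]          # clean records continue to the silver layer
quarantine = bronze[~ok]     # corrupted records are dealt with separately

silver.to_parquet("silver_orders.parquet")
quarantine.to_parquet("quarantine_orders.parquet")
```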


The Digital Skills Gap is Jeopardising Growth

With people staying in workforces longer than ever before and careers spanning five decades becoming the norm, upskilling at a massive scale is needed. However, this need is not fully addressed; a worrying 6 in 10 (58%) people we surveyed in the UK told us that they have already been negatively affected by a lack of digital skills. Organisations can’t just rely on recruiting from a limited pool of digital specialists; they also need to focus on upskilling their own employees, in both tech and human digital skills. At a recent digital skills panel debate in Manchester, the director of a recruitment agency stated bluntly that: “Many businesses are currently overpaying to bring in external digital skills because of increased competition and this just isn’t sustainable. Upskilling your current teams should be as important as recruiting new talent to keep costs in check and create a more balanced and loyal workforce.” It’s crucial to upskill employees, not only to gain the necessary digital capabilities in our organisations, but to build loyalty and retain valued team members.


Emerging sustainable technologies – expert predictions

AI and automation technologies offer a smart solution, too; they could channel energy when it is plentiful into less time-sensitive uses, such as charging up electric vehicles or heating storage heaters. For example, Drax has looked at ways of combining AI with smart meters to channel our energy use, so that we take advantage of those periods when energy creation exceeds demand. The debate over whether we need new technologies or just need to scale up existing sustainable technologies has even reached the higher echelons of power. John Kerry, US special presidential envoy for climate, and a certain Bill Gates say we need technologies which haven’t been invented yet. World-renowned climate change scientist Michael Mann disagrees. In his expert opinion, we just need to scale up existing technologies. ... But there is one other application — an application which will create extraordinary opportunity and open the way for many technologies we have been considering up to now. When all of our power is provided by renewables, the total annual supply is likely to exceed total annual demand by a large margin.


Women in IT: Progress in Workforce Culture, But Problems Persist

From Milică's perspective, the greatest challenge facing women in IT today is a lack of role models. “Women need to be the role models who can inspire young minds, especially more women and minority leaders,” she says. “Even at the individual level, each of us -- teachers, parents, and other influential adults -- can plant the seed and grow the understanding among young people of the importance of IT jobs, and how that career path can make a difference in our world and society.” She adds hiring bias and pay inequality, along with the lack of female role models, leaders, and advancement opportunities, all discourage women from pursuing a STEAM career. “Women have to work much harder both to get hired and to advance their careers -- which perhaps explains why 52% of women in cybersecurity hold postgraduate degrees, compared to only 44% of men,” Milică notes. She adds the industry also hasn’t done a great job sparking interest at an early age. “Attention to a career path starts with children as early as elementary school, and by middle or high school, many students will have made their decisions,” she explains.


EPSS explained: How does it compare to CVSS?

EPSS aims to help security practitioners and their organizations improve vulnerability prioritization efforts. The number of vulnerabilities in today’s digital landscape is growing rapidly, driven by factors such as the increased digitization of systems and society, increased scrutiny of digital products, and improved research and reporting capabilities. Organizations generally can only fix between 5% and 20% of vulnerabilities each month, EPSS claims, while fewer than 10% of published vulnerabilities are ever known to be exploited in the wild. Longstanding workforce issues are also at play: the annual ISC2 Cybersecurity Workforce Study shows shortages exceeding two million cybersecurity professionals globally. These factors warrant a coherent and effective approach to prioritizing the vulnerabilities that pose the highest risk, so that organizations avoid wasting limited resources and time. The EPSS model aims to provide some support by producing probability scores that a vulnerability will be exploited in the next 30 days; the scores range from 0 to 1, or 0% to 100%.
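
EPSS scores can also be pulled programmatically. The sketch below queries FIRST's public EPSS API and ranks a couple of example CVEs by score; the endpoint and response fields reflect the API's public documentation, but treat those details as an assumption to verify, and the CVE IDs are only placeholders.

```python
import requests

# Ask FIRST's public EPSS API for scores on a few example CVEs.
cves = ["CVE-2021-44228", "CVE-2022-22965"]
resp = requests.get("https://api.first.org/data/v1/epss",
                    params={"cve": ",".join(cves)}, timeout=10)
resp.raise_for_status()

# Each row carries the CVE ID and its EPSS probability (as a string).
scores = {row["cve"]: float(row["epss"]) for row in resp.json()["data"]}
for cve, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{cve}: {score:.3f} probability of exploitation in the next 30 days")
```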


Could it be quitting time?

The book tackles a challenge that proves stubbornly difficult for most people. Letting go of anything is hard, especially at a time when pundits tout the power of grit, building resilience, and toughing it out. Duke provides permission to see quitting as not only viable but often preferable, and she explains why people rarely give up at the right time. “Quitting is hard, too hard to do entirely on our own,” she writes. “We as individuals are riddled by the host of biases, like the sunk cost fallacy, endowment effect, status quo bias, and loss aversion, which lead to escalation of commitment. Our identities are entwined in the things that we’re doing. Our instinct is to want to protect that identity, making us stick to things even more.” These biases—some of them unconscious—prompt us to stick with jobs that have lost their appeal or value; hold on to losing stocks long after an inner voice screams “Sell!”; or endure myriad other situations that no longer serve us. Duke focuses far more on the thinking behind the decision to “quit or grit” rather than on the decision’s final outcomes.



Quote for the day:

"Teamwork is the secret that make common people achieve uncommon result." -- Ifeanyi Enoch Onuoha

Daily Tech Digest - November 29, 2022

Cloud-Native Goes Mainstream as CFOs Seek to Monitor Costs

There's interest from the CFO organization in third-party tools for cloud cost management and optimization that can give them a vendor-neutral tool, especially in multicloud environments, according to Forrester analyst Lee Sustar. "The cost management tools from cloud providers are generally fine for tactical decisions on spending but do not always provide the higher level views that the CFO office is looking for," he added. As organizations move to a cloud-native strategy, Sustar said the initiative will often come from the IT enterprise architects and the CTO organization, with backing from the office of the CIO. "Partners of various sorts are often needed in the shift to cloud-native, as they help generalize the lessons from the early adopters," he noted. "Today, organizations new to the cloud are focused not on lifting and shifting existing workloads alone, but modernizing on cloud-native tech. Multicloud container platform vendors offer a more integrated approach that can be tailored to different cloud providers," Sustar added.


Financial services increasingly targeted for API-based cyberattacks

APIs are a core part of how financial services firms are changing their operations in the modern era, Akamai said, given the growing desire for more and more app-based services among the consumer base. The pandemic merely accelerated a growing trend toward remote banking services, which led to a corresponding growth in the use of APIs. However, every new application, and every standardization of how various app functions talk to one another, creates APIs, and with them the potential target surface for an attacker grows. Only high-tech firms and e-commerce companies were more heavily targeted via API exploits than the financial services industry. “Once attackers launch web application attacks successfully, they could steal confidential data, and in more severe cases, gain initial access to a network and obtain more credentials that could allow them to move laterally,” the report said. “Aside from the implications of a breach, stolen information could be peddled in the underground or used for other attacks. This is highly concerning given the troves of data, such as personal identifiable information and account details, held by the financial services vertical.”


The future of cloud computing in 2023

Gartner research estimates that we exceeded one billion knowledge workers globally in 2019. These workers are defined as those who need to think creatively and deliver conclusions for strategic impact. These are the very people that cloud technology was designed to facilitate. Cloud integrations in many cases can be hugely advanced and mature from an operational standpoint. Businesses have integrated multi-cloud solutions, containerization and continuously learning AI/ML algorithms to deliver truly cutting-edge results, but those results are often not delivered at the scale or speed necessary to make the split-second decisions needed to thrive in today’s operating environment. For cloud democratization to be successful, companies need to upskill their knowledge workers and equip them with the right tools to deliver value from cloud analytics. Low-code and no-code tools reduce the experiential hurdle needed to deliver value from in-cloud data, whilst simultaneously delivering on the original vision of cloud technology — giving people the power they need to have their voices heard.


What Makes BI and Data Warehouses Inseparable?

Every effective BI system has a potent DWH at its core. That's because a data warehouse is a platform used to centrally gather, store, and prepare data from many sources for later use in business intelligence and analytics. Consider it a single repository for all the data needed for BI analyses. In a data analytics DWH, historical and current data are kept structured and ready for sophisticated querying. Once connected to business intelligence tools, it produces reports with forecasts, trends, and other visualizations that support practical insights. ETL (extract, transform, and load) tools, a DWH database, DWH access tools, and reporting layers are all parts of the business analytics data warehouse. These technologies speed up the data science process and reduce, or even eliminate, the need to write code to handle data pipelines. The ETL tools assist in data extraction from source systems, format conversion, and data loading into the DWH. Structured data for reporting is stored and managed by the database component.
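
As a rough illustration of the extract-transform-load flow described above, here is a toy run in Python, with SQLite standing in for the warehouse database; the file names, columns, and table are all hypothetical.

```python
import sqlite3
import pandas as pd

# Extract: pull raw rows from a source system (a CSV export here).
raw = pd.read_csv("sales_export.csv")

# Transform: normalize formats and derive the fields reporting will query.
raw["order_date"] = pd.to_datetime(raw["order_date"])
raw["revenue"] = raw["quantity"] * raw["unit_price"]

# Load: append the structured result into the warehouse's reporting table.
with sqlite3.connect("warehouse.db") as dwh:
    raw.to_sql("fact_sales", dwh, if_exists="append", index=False)
```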


Covering Data Breaches in an Ethical Way

Ransomware and extortion groups usually publicly release stolen data if a victim doesn't pay. In many cases, the victim organization hasn't publicly acknowledged it has been attacked. Should we write or tweet about that? ... These are victims of crime, and not every organization handles these situations well, but the media can make it worse. Are there exceptions to this rule? Sure. If an organization hasn't acknowledged an incident but numerous media outlets have published pieces, then the incident could be considered public enough. But many people tweet or write stories about victims as soon as their data appears on a leak site. I think that is unfair and plays into the attackers' hands, increasing pressure on victims. ... Covering cybercrime sensitively: using leaked personal details to contact people affected by a data breach is a touchy area. I only do this in very limited circumstances. I did it with one person in the Optus breach. The reason was that, at that point, there were doubts about whether the data had originated with Optus. The person also lived down the road from me, so I could talk to them in person.


EU Council adopts the NIS2 directive

NIS2 will set the baseline for cybersecurity risk management measures and reporting obligations across all sectors that are covered by the directive, such as energy, transport, health and digital infrastructure. The revised directive aims to harmonise cybersecurity requirements and implementation of cybersecurity measures in different member states. To achieve this, it sets out minimum rules for a regulatory framework and lays down mechanisms for effective cooperation among relevant authorities in each member state. It updates the list of sectors and activities subject to cybersecurity obligations and provides for remedies and sanctions to ensure enforcement. The directive will formally establish the European Cyber Crises Liaison Organisation Network, EU-CyCLONe, which will support the coordinated management of large-scale cybersecurity incidents and crises. While under the old NIS directive member states were responsible for determining which entities would meet the criteria to qualify as operators of essential services, the new NIS2 directive introduces a size-cap rule as a general rule for identification of regulated entities.


Cybersecurity: How to do More for Less

When assessing your existing security stack, several important questions need to be asked: Are you getting the most out of your tools? How are you measuring their efficiency and effectiveness? Are any tools dormant? And how much automation is being achieved? The same should be asked of your IT stack: is there any bloat or technical debt? Across your IT and security infrastructure, there are often unnecessary layers of complexity in processes, policies and tools that can lead to waste. For example, having too many tools leads to high maintenance and configuration overheads, draining both resources and money. Similarly, technologies that combine on-premises infrastructure and third-party cloud providers require complex management and processes. IT and cybersecurity teams, therefore, need to work together with a clear shared vision to find ways to drive efficiency without reducing security. This requires clarity over roles and responsibilities between security and IT teams for asset management and deployment of security tools. It sounds straightforward but often is not, due to historic approaches to tool rollout.


Being Agile - A Success Story

To better understand the Agile methodology and its concepts, it is crucial to understand the Waterfall methodology. Waterfall is another famous Software Development Life Cycle (SDLC) methodology. This methodology is a strict and linear approach to software development. It aims at a single, significant project outcome. On the other hand, Agile methodology is an iterative method that delivers results in short intervals. Agile relies on integrating a feedback loop to drive the next iteration of work. The diagram below describes other significant differences between these methodologies. In Waterfall, we define and fix the scope and estimate the resources and time to complete the task. In Agile, the time and resources are fixed (called an "iteration"), and the work is estimated for every iteration. Agile helps estimate and evaluate the work that brings value to the product and the stakeholders. It is always a topic of debate as to which methodology to use for a project. Some projects are better managed with Waterfall, while others are an excellent fit for Agile.


User Interface Rules That Should Never Be Overlooked

The most important user interface design rule that should never be overlooked is the rule of clarity. Clarity is critical when it comes to user interfaces, says Zeeshan Arif, founder and CEO of Whizpool, a software and website development company. “When you're designing an interface, you need to make sure your users understand what they can do at all times,” Arif advises. This means making sure that buttons are correctly labeled and that there aren't any unexpected changes or surprises that might confuse users. “If a button says ‘delete’, then it should delete whatever it's supposed to delete -- and only that thing,” he says. “If you have a button that does something else, then either make it a different color or label it differently, but don't put in something that looks like a delete button but doesn't actually delete anything.” Don't perplex users by designing a user interface crammed with superfluous options and/or features. “If you have too many buttons on one page, and none of them are labeled well enough for someone who isn't familiar with them, [users will] probably just give up before they even get started using your product, service, app, or website,” Arif says.


6 non-negotiable skills for CIOs in 2023

CIOs need to think about both internal integrations and external opportunities. They need to have strong relationships and be able to pull the business leaders together. For example, I’m working with an entrepreneurial organization that runs different lines of businesses that are very strong, with heads of those businesses who are also very strong. One of their challenges, however, is that their clients can be customers of multiple businesses. Between the seams, the client experiences the organizational structure of the business, which is a problem – a client should never experience your organizational structure. The person best equipped to identify and close those seams and integration points is the CIO. ... In the past, most organizations operated with a business group that sat between technology and the clients. The movement around agile, however, has knocked those walls down and today allows IT to become client-obsessed – we’re cross-functional teams that are empowered and organized around business and client outcomes. As a CIO, you need to spend time with clients and have a strong internal mission, too. You have to develop great leaders and motivate and engage an entire organization.



Quote for the day:

"A leader has the vision and conviction that a dream can be achieved._ He inspires the power and energy to get it done." -- Ralph Nader

Daily Tech Digest - November 28, 2022

5 ways to avoid IT purchasing regrets

When it comes to technology purchases, another regret can be not moving fast enough. Merim Becirovic, managing director of global IT and enterprise architecture at Accenture, says his clients often wonder whether they’re falling behind. “With the level of technology maturity today, it’s a lot easier to make good decisions and not regret them. But what I do hear are questions around how to get things done faster,” he says. “We’re getting more capabilities all the time, but it’s all moving so quickly that it’s getting harder to keep up.” A lag can mean missed opportunities, Becirovic says, which can produce a should-have-done-better reproach. “It’s ‘I wish I had known, I wish I had done,’” he adds. Becirovic advises CIOs on how to avoid such scenarios, saying they should make technology decisions based on what will add value; shift to the public cloud to create the agility needed to keep pace with and benefit from the quickening pace of IT innovation; and update IT governance practices tailored to overseeing a cloud environment with its consumption-based fees.


5 digital transformation metrics to measure success in 2023

If money (whether earned or saved) is the first pillar of most business metrics, then time is another. That could be time spent or saved (more on that in a moment), but it’s also in the sense of pure speed. "Time to market should be one of the most critical digital transformation metrics right now for enterprises across industries,” says Skye Chalmers, CEO of Image Relay. “The market impact of a digital transformation project is all about its speed: If you don’t cross the finish line first with compelling new customer [or] employee experiences or other digital modernization initiatives, your competitors will.” So while an overall digital transformation strategy may not have an endpoint, per se, the goals or milestones that comprise that strategy should have some time-based measurement. And from Chalmers’ point of view, the speed with which you can deliver should be a key factor in decision-making and measurement. Focusing on the time-to-market metric “will directly improve an enterprise’s competitive position and standing with customers,” Chalmers says.


More Organizations Are Being Rejected for Cyber Insurance — What Can Leaders Do?

Before soliciting cyber insurance quotes, examine several areas of your network security to understand what vulnerabilities exist. Insurers will do just that, so anticipating gaps in your infrastructure, software, and systems will provide you with a clearer idea of what your company needs. Start with your enterprise network. Who has access and to what degree? Every person who has access to your network provides an attack vector, increasing the possibility of an attacker accessing more data through lateral movement. If an outside agent can gain entry to your network, that person or bot can harvest the most privileged credentials and move between servers and throughout the storage infrastructure while continually exploiting valuable sensitive data. That’s why most insurance audits consider privilege sprawl to be among the top risks. It happens when special rights to a system have been granted to too many people. It impacts the cost of premiums and could even lead to a loss of coverage. Public cloud assets also present an opportunity for a strike. Is access to that information secure? 


Retirees Must Have These Four Key Components To Make A Winning Side Hustle

Since when does everything always go as planned? Spoiler Alert: It never does. There’s even a saying for this: “Into each life, a little rain must fall.” And when those rain clouds do appear, what do successful entrepreneurs do? They don’t pack up their gear and head for shelter. No, they plant their feet firmly into the (muddy) ground and start selling umbrellas. “When you study success and read extensively about entrepreneurs, you realize that successful people come from a variety of backgrounds and circumstances, but they have one thing in common—they consistently do the work,” says Case Lane, Founder of Ready Entrepreneur in Los Angeles. “The only talent needed is knowing you can make that commitment to keep working to ensure business success.” Entrepreneurs don’t fear change (see above); they see it as an opportunity. “I knew how to solve a problem that many people were experiencing, and I knew I could help those people,” says Chane Steiner, CEO of Crediful in Scottsdale, Arizona. 


Top 6 security risks associated with industrial IoT

Device hijacking is one of the common security challenges of IIoT. It can occur when the IoT sensor or endpoint is hijacked. This can lead to serious data breaches depending on the intelligence of the sensors as well as the number of devices the sensors are connected to. Sensor breaches or hijacks can easily expose your sensors to malware, enabling the hacker to take control of the endpoint device. With this level of control, hackers can run the manufacturing processes as they wish. ... IIoT deals with many physical endpoint devices that can be stolen if not protected from prying eyes. This situation can pose a security risk to any organization if these devices are used to store sensitive information. Organizations that rely heavily on endpoint devices can make arrangements to ensure those devices are protected, but storing critical data on them can still raise safety concerns due to the growing number of endpoint attacks. To minimize the risk associated with device theft, it's advisable to avoid storing sensitive information on endpoint devices. Instead, organizations should use cloud-based infrastructure to store critical information.


Cloud security starts with zero trust

Generally speaking, the best way for an organization to approach zero trust is for security teams to take the mindset that the network is already compromised and develop security protocols from there. With this in mind, when implementing zero trust into a cloud environment, organizations must first perform a threat assessment to see where their biggest vulnerabilities lie. Zero trust strategy requires an inventory of every single item in a company’s portfolio, including a list of who and what should and should not be trusted. Additionally, organizations must develop a strong understanding of their current workflows and create a well-maintained inventory of all the company’s assets. After conducting a thorough threat assessment and developing an inventory of key company information, security controls must be specifically designed to address any threats identified during the threat assessment to tailor the zero trust strategy around them. The nature of zero trust is inherently complex due to the significant steps that a company has to take to achieve a true zero trust atmosphere, and this is something that more businesses should take into account.


How to Not Screw Up Your Product Strategy

Creating the strategy also requires influencing and collaborating with many people. All of these interactions require time to get people on the same page, discuss disagreements, and incorporate improvements or changes. Your market can also change quickly: new competitors can emerge, technologies change, and customer feedback can shift. These all can result in changes in perspective or emphasis, which can further slow down putting together a product strategy. And finally, even after you’ve done all the hard work putting the strategy together, you have a lot of work to do communicating that strategy and getting people to understand it. This also takes a lot of time. The end result of all these steps is that a common failure mode is “the product strategy is coming.” My recommendation is to always have a working product strategy. Because strategy work takes time, you shouldn’t make people wait for it. If you don’t have a real strategy, start with a temporary, short-term strategy, based on your best thinking at the moment.


Why Microsegmentation is Critical for Securing CI/CD

While cloud-native application development has many benefits, traditional network architectures and security practices cannot keep up with DevOps practices like CI/CD. Microsegmentation reduces network risk and prevents lateral movement by isolating environments and applications. However, it can be a challenge to implement segmentation in a cloud-native environment. Typical network security teams use a centralized approach with one SecOps team responsible for all security management. For example, some networks have ticket-based approval systems where the central team reviews each request based on access policies. However, this system is slow and prone to human error. Teams can use DevOps methods to operationalize microsegmentation, implementing policy as code. You can also leverage a microsegmentation solution that helps automate and secure the process. The security team enforces basic segmentation policies, while application owners create more granular policies. This decentralized security approach preserves the agility of DevOps.
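
One way to read "policy as code" here is that segmentation rules live in version control and are generated and reviewed like any other code. The sketch below builds a Kubernetes NetworkPolicy manifest in Python as one possible concrete form; the app names and namespace are hypothetical, and real environments may rely on a dedicated microsegmentation product instead.

```python
import json

def isolation_policy(app: str, allowed_client: str, namespace: str) -> dict:
    """Build a NetworkPolicy that only lets one labeled client reach `app`,
    blocking other lateral movement toward it."""
    return {
        "apiVersion": "networking.k8s.io/v1",
        "kind": "NetworkPolicy",
        "metadata": {"name": f"allow-{allowed_client}-to-{app}",
                     "namespace": namespace},
        "spec": {
            "podSelector": {"matchLabels": {"app": app}},
            "policyTypes": ["Ingress"],
            "ingress": [{"from": [
                {"podSelector": {"matchLabels": {"app": allowed_client}}}
            ]}],
        },
    }

# Application owners can generate, review, and version granular policies.
print(json.dumps(isolation_policy("payments", "checkout", "prod"), indent=2))
```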


Data Strategy: Synthetic Data and Other Tech for AI's Next Phase

Synthetic data is one of several AI technologies identified by Forrester as less well known but having the power to unlock significant new capabilities. Others on the list are transformer networks, reinforcement learning, federated learning and causal inference. Curran explains that transformer networks use deep learning to accurately summarize large corpuses of text. “They allow for folks like myself to basically create a pretty concise slide based off of a piece of research I’ve written,” he says. “I already use AI-generated images in probably 90% of my presentations at this point in time.” The same base technology of transformer networks and large language models can be used to generate code for enterprise applications, Curran says. Reinforcement learning allows tests of many actions in simulated environments, enabling a large number of micro-experiments that can then be used for constructing models to optimize objectives or constraints, according to Forrester. ... Such a simulation would let you account for your big order, the cost of shutting down at peak season, and other factors in your decision of whether to take that piece of equipment down for maintenance.


Smart office trends to watch

A growing number of office buildings now have an effective Building Management System (BMS). Ideally this will be combined with energy generation and storage and water management systems, which can deliver huge cost, resource and emissions savings, but a good BMS is a good start. It can optimise energy use through smart lighting and temperature systems, controlled by software which draws information from Internet of Things (IoT) or Radio Frequency Identification (RFID) sensors throughout the building. Energy and cost savings are also improved by smart LED lighting, controlled by sensors that ensure it is only used as and when needed. Providers of BMS and related solutions include Smarter Technologies, which uses RFID sensors to monitor energy and water use, temperature, humidity, air quality, room or desk occupancy and even whether bins need emptying. SP Digital’s GET Control system offers IoT and AI-based temperature control, dividing open plan offices into microzones, through which air flow is regulated based on occupancy and both conditions inside and ambient weather conditions outside the building. 



Quote for the day:

"In simplest terms, a leader is one who knows where he wants to go, and gets up, and goes." -- John Erksine

Daily Tech Digest - November 27, 2022

Business Case – Why Enterprise Architecture Needs to Change – Part I

The solution to moving out of the “stone age” is to use a digital end-to-end approach for Architecture content (whether EA or SA), and provide openness and transparency across EA, project, and reusable component Architectures. Just like any digital approach to any business problem, the use of structured data is key. The best structured-data language for Architecture is arguably ArchiMate, which has a rich notation covering the depth and breadth of Architecture modelling, and also a rich set of connectors to link elements. ... Even if the new hire has significant experience in the given industry, the new organisation’s IT platform and processes will likely vary greatly from the person’s past experience. It takes several months or longer for new staff to accumulate enough knowledge about how the business and IT platform work to operate effectively without help from other staff. The cost of this knowledge gap is the new person delivering outcomes more slowly than other staff and consuming other staff’s time unnecessarily by asking questions like ‘what systems do we have?’, ‘what does the business do?’, ‘how does system X work?’ and so on.


Why API security is a fast-growing threat to data-driven enterprises

API security focuses on securing this application layer and addressing what can happen if a malicious hacker interacts with the API directly. API security also involves implementing strategies and procedures to mitigate vulnerabilities and security threats. When sensitive data is transferred through an API, a protected API can guarantee the message’s secrecy by making it available only to apps, users and servers with appropriate permissions. It also ensures content integrity by verifying that the information was not altered after delivery. “Any organization looking forward to digital transformation must leverage APIs to decentralize applications and simultaneously provide integrated services. Therefore, API security should be one of the key focus areas,” said Muralidharan Palanisamy, chief solutions officer at AppViewX. Talking about how API security differs from general application security, Palanisamy said that application security is similar to securing the main door, which needs robust controls to prevent intruders. At the same time, API security is all about securing windows and the backyard.

 

Artificial Intelligence Can Enhance Banking Compliance

Technology has changed our society, and banks and other financial institutions have digitalized their operations at a rapid pace as well. However, the financial crime compliance units of these institutions still rely mainly on heavy manual processes. The banking compliance units’ key reason for their cautious approach in the utilisation of AI and automation has been uncertainty about technology. Do regulators approve machine-based decision-making, and is machine learning logic fair in identifying suspicious activities? However, there is a clear need for utilising technology in financial crime compliance. During the last number of years, Ireland has witnessed a rise in financial crime, with illegal proceeds making their way into the financial system, often from international sources. Last month, data from Banking and Payments Federation Ireland showed that over €12m was transferred illegally through so-called ‘money mule’ accounts in the first six months of the year. When compared to the same period last year, the quantity of bank accounts linked to the criminal practice in Ireland almost doubled to 3,000 between January and June 2022.


Big tech has not monopolized big A.I. models, but Nvidia dominates A.I. hardware

Interest in A.I. software startups targeting business use cases also remains formidable. While the total amount invested in such companies fell 33% last year as the venture capital market in general pulled back on funding in the face of fast-rising interest rates and recession fears, the total was still expected to reach $41.5 billion by the end of 2022, which is higher than 2020 levels, according to Benaich and Hogarth, who cited Dealroom for their data. And the combined enterprise value of public and private software companies using A.I. in their products now totals $2.3 trillion—which is also down about 26% from 2021—but remains higher than 2020 figures. But while the race to build A.I. software may remain wide open for new entrants, the picture is very different when it comes to the hardware on which these A.I. applications run. Here Nvidia’s graphics processing units completely dominate the field and A.I.-specific chip startups have struggled to make any inroads. The State of AI notes that Nvidia’s annual data center revenue alone—$13 billion—dwarfs the valuation of chip startups such as SambaNova ($5.1 billion), Graphcore ($2.8 billion) and Cerebras ($4 billion). 


Predictive Analytics in Healthcare

Clinicians, healthcare organizations and health insurance companies use predictive analytics to estimate the probability of their patients developing certain medical conditions, such as cardiac problems, diabetes, stroke or COPD. Health insurance companies were early adopters of this technology, and healthcare providers now apply it to identify which patients need interventions to prevent conditions and enhance health outcomes. Clinicians also use predictive analytics to identify patients whose conditions are progressing into sepsis. As is the case with many applications of predictive analytics in healthcare, however, the ability to use this technology to predict how a patient’s condition might progress is limited to certain conditions and far from widely deployed. Healthcare organizations also use predictive analytics to identify which hospital inpatients are likely to exceed the average length of stay for their conditions by analyzing patient, clinical and departmental data. This insight allows clinicians to adjust care protocols to keep the patients’ treatments and recoveries on track. That in turn helps patients avoid overstays, which not only drive up expenses and divert limited hospital resources, but also may endanger patients by keeping them in surroundings that could expose them to secondary infections.


How to Set Yourself Up For Success As a New Data Science Consultant With No Experience

The key is to know what you’re good at and focus on it. Going out on your own as a consultant is scary enough — ensure that you’re going to be marketing and using skills that you’re comfortable with. Having confidence that you can successfully produce results using your tools and skills of choice goes a long way to becoming a successful consultant. Additionally, do some market research to see where your niche could lie. While they say that data scientists should all be generalists in the beginning, I believe that consultants should focus on specializing in niches that complement their skills and their alternative knowledge. For example, I would focus on becoming a data science consultant who specializes in helping companies solve their environmental problems — this would combine my specialized skills (data science) with my alternative knowledge and educational background in environmental science. Companies love working with consultants who have first-hand experience in their sector, so it can’t hurt to play to your strengths, past employment, education, or interest background.


The future of employment in IT sector

Whilst companies keep up with the changing economic climate, what’s become undeniable is the war for recruiting good talent, now more than ever. There has been a significant change in employees’ needs and priorities. Top talent is re-evaluating their careers based on aspects like flexibility, career growth and employee value proposition. Companies must therefore invest in ‘Active Sourcing’ to create a rich pipeline, and not only recruit candidates but also train them for the upcoming 4th industrial revolution. They need to invest in employees’ skills and holistic development, not forgetting to create a safe, healthy work environment to retain the talent. As dynamic as it is, one cannot deny the menace of tech burnout. This blog describes it perfectly: ‘Tech burnout refers to the extreme exhaustion and stress that many employees in the technology sector experience. While burnout has always been an issue in many industries, 68% of tech workers feel more burned out than they did when they worked at an office.’ Technology is the most rapidly evolving industry with a challenging work environment.


On the Psychology of Architecture and the Architecture of Psychology

Most of our intelligence, however, consists of patterns that we execute efficiently, automatically and quickly. Some of these are natural elements, which are fixed: e.g. a propensity to communicate and use tools, to perform ‘mental travel’ — memory, scenarios, fantasy — and all of it based on pattern creation and reinforcement. Some of these elements may even be genetic (like basic strategies such as wait-and-see versus go-for-it you can observe in small children), but most of it is probably learned. All of this is part of Kahneman’s ‘System 1’. We learn by employing our capacity for logic and ratio and our copying-and-being-reinforced capability — and while we do a lot more of the latter two than the former, culturally, we tend to believe that the reverse is true. Learning by reinforcement also includes learning by doing. Chess grand masters have very effective fast ‘patterns’ in the ‘malleable instinct’ part of their brains, and the difference between grand masters and good amateurs is not their power of logic and ratio — calculating, thinking moves ahead — but the patterns that identify potential good moves before they start to calculate, and these patterns come from playing a lot of games. You also have to maintain your patterns: it is ‘use it or lose it’.


7 Common Data Quality Problems

Data inconsistencies: This problem occurs when multiple systems store information without an agreed-upon, standardized method of recording and storing it. Inconsistency is sometimes compounded by data redundancy. ... Fixing this problem requires the data to be homogenized (or standardized) before or as it comes in from various sources, possibly through the use of an ETL data pipeline. Incomplete data: This is generally considered the most common issue impacting Data Quality. Key data columns will be missing information, often causing analytics problems downstream. A good method for solving this is to install a reconciliation framework control. This control would send out alerts (theoretically to the data steward) when data is missing. Orphaned data: This is a form of incomplete data. It occurs when some data is stored in one system, but not the other. If a customer’s name is listed in table A, but their account is not listed in table B, this would be an “orphan customer.” And if an account is listed in table B, but is missing an associated customer, this would be an “orphan account.”
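
Orphan checks like the one just described are easy to automate. Below is a minimal sketch with pandas, using hypothetical customer and account tables in place of "table A" and "table B".

```python
import pandas as pd

customers = pd.DataFrame({"customer_id": [1, 2, 3]})        # "table A"
accounts = pd.DataFrame({"account_id": [10, 11],
                         "customer_id": [2, 4]})            # "table B"

# Orphan accounts: accounts whose customer never appears in table A.
orphan_accounts = accounts[
    ~accounts["customer_id"].isin(customers["customer_id"])]

# Orphan customers: customers with no account at all in table B.
orphan_customers = customers[
    ~customers["customer_id"].isin(accounts["customer_id"])]

print(orphan_accounts)   # account 11 references missing customer 4
print(orphan_customers)  # customers 1 and 3 have no accounts
```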


Building IT Infrastructure Resilience For A Digitally Transformed Enterprise

At a minimum, resiliency means having stable operations, consistent revenue, manageable security risks, efficient workflows, and an informed and agile employee base. Having visibility over the operating systems of network devices can reduce network downtime and open doors to further efficiencies. If a business is resilient, it can maintain stable network operations, drive down IT costs and deliver a more robust service at a lower cost. Overall, when businesses can dramatically lower IT expenses and have better visibility, they can expend resources on separate projects that improve the quality of service—a win for all. From a regulatory perspective, regulators now want to see everything documented. Take mobile banking, for example; regulators want to know everything, including what code is being used on which servers as well as which people and processes have access to which services. Intelligently automated network operations can allow enterprises to be better equipped to answer the questions that regulators ask, such as how they're validating and how often they're doing a failover. 



Quote for the day:

"A good general not only sees the way to victory; he also knows when victory is impossible." -- Polybius

Daily Tech Digest - November 26, 2022

How automation can solve persistent cybersecurity problems

Think about the normal day for a security analyst. If we’re expecting them to handle alerts, events that have come up, and new attacks that are happening right now—that’s a lot of new information to look at and assess. How much time do they have to read dozens of RSS feeds, research blogs, industry and government reports, security vendor reports, news websites, and GitHub repositories? Collecting and making sense of all that data becomes crucial, but there’s no way individuals can do this on their own quickly enough. Being able to automate that process so you can get to the information that you are going to use now or later and filter out the noise is essential. Obviously, automation is a fundamental capability to reduce the burden of manual review and prioritization of alerts. But while a recent report on cybersecurity automation adoption finds that confidence in automation is rising, only 18% of respondents are applying automation to alert triage. Automation can also help mitigate risk from vulnerabilities in legacy systems. 
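
As a taste of what automating that collection step can look like, here is a small sketch using the feedparser library to sweep a set of feeds for watch-list keywords; the feed URLs and keyword list are placeholders, and a production pipeline would add deduplication, scoring, and routing into a SOAR or ticketing tool.

```python
import feedparser

# Placeholder sources; a real list would span research blogs, vendor
# reports, government advisories, and news sites.
FEEDS = [
    "https://example.com/security-news.rss",
    "https://example.com/vendor-advisories.rss",
]
KEYWORDS = {"ransomware", "zero-day", "cve"}

# Pull every entry and keep only the ones matching the watch list.
for url in FEEDS:
    for entry in feedparser.parse(url).entries:
        title = entry.get("title", "").lower()
        if any(kw in title for kw in KEYWORDS):
            print(f"{entry.title}\n  {entry.link}")
```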


The Board’s Role in Advancing Digital Trust

Boards have reported that the use of FAIR provides an organized means to identify the value of assets, design probable loss scenarios, and allocate capital to get the most bang for the buck. Some boards have reported FAIR has been useful in demonstrating to objective third parties, like regulators, that they have been prudent in managing digital risk. Reputational loss is widely considered the largest impact from a cyber incident. Unfortunately, reliable statistics are not yet available, but anecdotal evidence supports the popular belief. ... The ability of the board to appropriately ensure the company has the proper level of cyber resilience requires an understanding of an adverse event but also of the total cost of controls, ranging from the mundane to worst-case scenarios. Scenario-based exercises combined with CRQ techniques, like FAIR, provide an objective means to assess materiality and the most appropriate capital allocation. The allocation of too little capital leaves you exposed, while too much wastes capital that could be better applied elsewhere. Determining the cost of a control is almost always the easy part.


Top employee cybersecurity tips for remote work and travel

Trip or no trip, lock your SIM card. SIM-jacking (or SIM-swapping, unauthorized port-out or “slamming”) is a real and underreported crime where threat actors pretend to be you, contact your wireless provider and “port over” your SIM card to your (their) “new phone.” Imagine someone stealing your entire online life, including your social media accounts. In other words, your phone number is now theirs. All your password resets now run through the threat actor. Considering how many work credentials, social media accounts and apps run through your phone number, the nightmare of this crime quickly becomes evident. If you haven’t already done so, lock down your SIM card with your wireless provider. ... Use two-factor authentication (2FA) everywhere and with everything. When choosing how to receive the authentication code, always opt for token over text as it’s much more secure. At Black Hat 2022, a Swedish research team demonstrated exactly how insecure text authentications are. If a hacker has your login credentials and phone number, text-based authentication simply won’t protect you.


AI accountability held back by ‘audit-washing’ practices

Published under the GMF think-tank’s Digital Innovation and Democracy Initiative, the report said that while algorithmic audits can help correct for the opacity of AI systems, poorly designed or executed audits are at best meaningless, and at worst can deflect attention from, or even excuse, the harms they are supposed to mitigate. This is otherwise known as “audit-washing”, and the report said many of the tech industry’s current auditing practices provide false assurance because companies are either conducting their own self-assessments or, when there are outside checks, are still assessed according to their own goals rather than conformity to third-party standards. “If well-designed and implemented, audits can abet transparency and explainability,” said the report. “They can make visible aspects of system construction and operation that would otherwise be hidden. Audits can also substitute for transparency and explainability. Instead of relying on those who develop and deploy algorithmic systems to explain or disclose, auditors investigate the systems themselves.


7 dos and don’ts for working with offshore agile teams

Many companies create business continuity plans to manage a crisis around key business operations. But these plans may overlook specifics for small offshore development teams or not account for intermittent disruptions to internet, power, or other resources that impact an offshore team’s safety, health, or productivity. “If you’re working with a global, distributed team, you need to accept the responsibilities that come with supporting your workforce—whether they are across the world or seated two desks away,” says Andrew Amann, CEO of NineTwoThree Venture Studio. “This means having a plan in place for when a global crisis limits your team members’ ability to work.” Amann offers several recommendations for developing a practical plan. “Cross-train employees, build relationships with development agencies, plan for difficulties with offshore payments, and make sure you stand behind your distributed teams when they need help,” he says.


Almost half of customers have left a vendor due to poor digital trust: Report

The road to digital trust is not always smooth sailing. The number one IT challenge cited was managing digital certificates, rated as important by 100% of enterprises, while regulatory compliance and handling the massive scope of what they are protecting was deemed important by 99% of respondents. Other challenges cited in the research included the difficulty of securing a complex, dynamic, multivendor network, and a lack of staff expertise. The report also points out that many common security practices have yet to be implemented. ... For companies still looking for ways to improve digital trust, DigiCert recommends making it a strategic imperative and recognizing the impact it has on business outcomes such as customer loyalty and revenue. DigiCert said it’s also important to remember that digital trust awareness is rising among users and customers, meaning that business success and reputation are directly tied to an organization’s ability to ensure digital trust at a high level.


A far-sighted approach to machine learning

The researchers focused on a problem known as multiagent reinforcement learning. Reinforcement learning is a form of machine learning in which an AI agent learns by trial and error. Researchers give the agent a reward for “good” behaviors that help it achieve a goal. The agent adapts its behavior to maximize that reward until it eventually becomes an expert at a task. But when many cooperative or competing agents are simultaneously learning, things become increasingly complex. As agents consider more future steps of their fellow agents, and how their own behavior influences others, the problem soon requires far too much computational power to solve efficiently. This is why other approaches only focus on the short term. “The AIs really want to think about the end of the game, but they don’t know when the game will end. They need to think about how to keep adapting their behavior into infinity so they can win at some far time in the future. Our paper essentially proposes a new objective that enables an AI to think about infinity,” says Kim.
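
For context on the "thinking about infinity" problem, the standard device in reinforcement learning is discounting, which collapses an infinite reward stream into a finite value; the toy sketch below shows that device only, as background. It is not the new objective the MIT researchers propose.

```python
# Discounting: V = sum over t of gamma^t * r_t stays finite for gamma < 1,
# letting an agent weigh an unbounded future without an infinite sum.
def discounted_return(rewards, gamma=0.99):
    value, weight = 0.0, 1.0
    for r in rewards:
        value += weight * r
        weight *= gamma
    return value

# A reward of 1.0 forever is worth about 1 / (1 - gamma) = 100.
print(discounted_return([1.0] * 10_000))  # prints roughly 100.0
```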


Demand for IT pros remains high even as layoffs continue

Even as layoffs continue, unemployment in the tech sector has remained at near-historic lows, hovering around 2.2%. That compares with the overall US unemployment rate of 3.7% as of October. So far this year, tech industry employment has increased by 193,900 jobs, 28% higher than the same period in 2021, according to a jobs report from CompTIA, a nonprofit association for the IT industry and workforce. “Tech hiring activity remains steady, but there are undoubtedly concerns of a slowing economy,” CompTIA CEO Tim Herbert said in a statement. While November’s job data is not expected to be as robust as the same period a year earlier (when 73,600 jobs were added), the overall projection is that it will remain at a status quo level, with hiring continuing at the same rate as in the last two quarters. “All-in-all, experienced IT Professionals will be in high demand,” Janco said. “Especially those who exhibit a strong work ethic and are results-oriented. Positions that will be in low demand will be administrative and non-line supervisors and managers.”


What I Learned in My First 6 Months as a Director of Data Science

The FAANG companies (Facebook, Apple, Amazon, Netflix, Google) can afford to pay amazing salaries. But most companies hiring data scientists are not like that. Don't get me wrong: data scientists can still make a very decent living! But in the world of practicing data scientists outside big tech, things are much more realistic. Unfortunately, though, it means I am competing for talent against the FAANG companies. As such, I have had to get very creative in where I advertise my postings and do my recruiting. Data scientists will always look for jobs at the FAANG companies, but they don't always think about non-tech companies as employers of data scientists. So I have learned that I have to be much more proactive in marketing my open roles. LinkedIn is great and recruiters can be helpful. However, I have also found great success recruiting in unusual online forums — places like Discord, Slack, and Twitter. But make no mistake: recruiting data scientists is a full-contact sport! It is messy. You have to move quickly.


Five Key Components of an Application Security Program

Once an application architecture and design are defined, security risk assessments should be performed to identify and categorize the inherent security risk of the planned application architecture and the application's expected functional capabilities. These assessments should cover the types of data, business processes, third-party systems and platforms, and information infrastructure with which the application will interact, and to and from which it will store, process, and transmit data. With insight into inherent security risk, appropriate security control objectives and associated security controls can be defined to manage risk appropriately within the application. Controls can include, but are not limited to, web application firewalls (WAFs) and application programming interface (API) security gateways, encryption capabilities, authentication and secrets management, logging requirements, and other security controls. The identification of security instrumentation requirements should also be included in the architecture and design stage of application development.
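As a rough illustration of how such an assessment might feed control selection, the sketch below maps hypothetical inherent-risk attributes to the control families the article names; the risk categories and mappings are invented for the example, not taken from the article.

```python
# Hypothetical sketch: mapping inherent-risk findings from a design-stage
# assessment to candidate control objectives.
from dataclasses import dataclass

@dataclass
class RiskProfile:
    data_types: set        # e.g. {"PII", "payment"}
    internet_facing: bool
    third_party_apis: bool

# Invented mapping from data types to controls in the families the article lists.
CONTROL_MAP = {
    "PII": ["encryption at rest", "access-event logging"],
    "payment": ["encryption in transit", "secrets management"],
}

def required_controls(profile: RiskProfile) -> list:
    controls = []
    for dt in profile.data_types:
        controls += CONTROL_MAP.get(dt, [])
    if profile.internet_facing:
        controls.append("web application firewall (WAF)")
    if profile.third_party_apis:
        controls.append("API security gateway")
    return sorted(set(controls))

print(required_controls(RiskProfile({"PII", "payment"}, True, True)))
```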



Quote for the day:

"Problem-solving leaders have one thing in common: a faith that there's always a better way." -- Gerald M. Weinberg

Daily Tech Digest - November 25, 2022

Ripe For Disruption: Artificial Intelligence Advances Deeper Into Healthcare

The challenges and changes needed to advance AI go well beyond technology considerations. “With data and AI entering healthcare, we are dealing with an in-depth cultural change that will not happen overnight,” according to Pierron-Perlès and her co-authors. “Many organizations are developing their own acculturation initiatives to develop the data and AI literacy of their resources in formats that are appealing. AI goes far beyond technical considerations.” There has been great concern about too much AI de-humanizing healthcare. But AI, once carefully considered and planned, may prove to augment human care. “People, including providers, imagine AI will be cold and calculating without consideration for patients,” says Garg. “Actually, AI-powered automation for healthcare operations frees clinicians and others from the menial, manual tasks that prevent them from focusing all their attention on patient care. While other AI-based products can predict events, the most impactful are incorporated into workflows in order to resolve issues and drive action by frontline users.”


Extinguishing IT Team Burnout Through Mindfulness and Unstructured Time

Mindfulness is fundamentally about awareness. For it to grow, begin by observing your state of mind, especially when you find yourself in a stressful situation. Instead of fighting negative emotions, observe your mental state as they arise. Think about how you’d conduct a deep root cause analysis on an incident and apply that same rigor to yourself. The key to mindfulness is paying attention to your reaction to events without judgment. This can unlock a new way of thinking because it accepts your reaction while still enabling you to do what is required for the job. This contrasts with being stuck behind frustration or avoiding new work as it rolls in. ... Mindfulness is an individual pursuit, while creativity is an enterprise pursuit, and providing space for employees to be creative is another key to preventing burnout. But there are other benefits as well. There is a direct correlation between creativity and productivity. Teams that spend all their time working on specific processes and problems struggle to develop the creative solutions that could move a company forward.


Overcoming the Four Biggest Barriers to Machine Learning Adoption

Some businesses hit the first hurdles of adopting AI and ML before they even begin. Machine learning is a vast field that pervades most aspects of AI. It paves the way for a wide range of potential applications, from advanced data analytics and computer vision to Natural Language Processing (NLP) and Intelligent Process Automation (IPA). A general rule of thumb for selecting a suitable ML use case is to “follow the money,” in addition to the usual recommendations on framing the business goals – what companies expect machine learning to do for their business, like improving products or services, improving operational efficiency, and mitigating risk. ... The biggest obstacle to deploying AI-related technologies is corporate culture. Top management is often reluctant to take investment risks, and employees worry about losing their jobs. To secure stakeholder and employee buy-in, businesses must start with small-scale ML use cases that demand realistic investments, achieve quick wins, and persuade executives. By providing workshops, corporate training, and other incentives, they can promote innovation and digital literacy.


Fixing Metadata’s Bad Definition

A bad definition has practical implications. It makes misunderstandings much more likely, which can infect important processes such as data governance and data modeling. Thinking about this became an annoying itch that I couldn’t scratch. What follows is my thought process working toward a better understanding of metadata and its role in today’s data landscape. The problem starts with language. Our lexicon hasn’t kept up with modern data’s complexity and nuance. There are three main issues with our current discourse about metadata:

Vague language - We talk about data in terms of “data” or “metadata”. But one category encompasses the other, which makes it very difficult to differentiate between them. These broad, self-referencing terms leave the door open to being interpreted differently by different people.

A gap in data taxonomy - We don’t have a name for the category of data that metadata describes, which creates a gap at the top of our data taxonomy. We need to fill it with a name for the data that metadata refers to.

Metadata is contextual - The same data set can be both metadata and not metadata depending on the context. So we need to treat metadata as a role that data can play rather than a fixed category.
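The contextual point is easy to make concrete. In the toy sketch below (all names invented for illustration), the same records serve as metadata when they describe image files, and as plain data when they are themselves the object of analysis.

```python
# The same records playing two roles, per the article's "metadata is a role,
# not a fixed category" argument.
files = [
    {"path": "/img/cat.jpg", "size_kb": 512, "created": "2022-11-01"},
    {"path": "/img/dog.jpg", "size_kb": 204, "created": "2022-11-12"},
]

# Role 1: metadata -- the records describe other data (the image files).
thumbnails_needed = [f["path"] for f in files if f["size_kb"] > 300]

# Role 2: data -- a storage-analytics job treats the very same records as
# the dataset under analysis, not as a description of something else.
total_kb = sum(f["size_kb"] for f in files)

print(thumbnails_needed, total_kb)
```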


Addressing Privacy Challenges in Retail Media Networks

The top reason that consumers cite for mistrusting how companies handle their data is a lack of transparency. Customers know at this point that companies are collecting their data. And many of these customers won’t mind that you’re doing it, as long as you’re upfront about your intentions and give them a clear choice about whether they consent to have their data collected and shared. What’s more, recent privacy laws have increased the need for companies to shore up data security or face the consequences. In the European Union, there’s the General Data Protection Regulation (GDPR). In the U.S., laws vary by state, but California currently has the most restrictive policies thanks to the California Consumer Privacy Act (CCPA). Companies that have run afoul of these laws have incurred fines as big as $800 million. Clearly, online retailers that already have — or are considering implementing — a retail media network should take notice and reduce their reliance on third-party data sources that may cause trouble from a compliance standpoint.
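The “clear choice” described here amounts to gating both collection and sharing on explicit opt-in. A minimal sketch of that gating, with hypothetical names and in-memory stand-ins for real systems:

```python
# Illustrative consent gate: nothing is collected without opt-in, and
# nothing is shared with retail-media partners without a second opt-in.
from dataclasses import dataclass

@dataclass
class ConsentRecord:
    customer_id: str
    collect: bool = False   # may we collect behavioral data?
    share: bool = False     # may we share it with retail-media partners?

first_party_store = []      # stand-in for an internal event store
partner_feed = []           # stand-in for a partner data feed

def record_event(consent: ConsentRecord, event: dict) -> None:
    if not consent.collect:
        return                          # no consent, no collection
    first_party_store.append(event)
    if consent.share:
        partner_feed.append(event)      # shared only on explicit opt-in

record_event(ConsentRecord("c1", collect=True, share=False), {"page": "/deals"})
print(len(first_party_store), len(partner_feed))  # 1 0
```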


For Gaming Companies, Cybersecurity Has Become a Major Value Proposition

Like any other vertical industry, gaming companies are tasked with protecting their organizations from all manner of cybersecurity threats to their business. Many of them are large enterprises with the same concerns for the protection of internal systems, financial platforms, and employee endpoints as any other firm. "Gaming companies have the same responsibility as any other organization to protect customer privacy and preserve shareholder value. While not specifically regulated like hospitals or critical infrastructure, they must comply with laws like GDPR and CaCPA," explains Craig Burland, CISO for Inversion6, a managed security service provider and fractional CISO firm. "Threats to gaming companies also follow similar trends seen in other segments of the economy — intellectual property (IP) theft, credential theft, and ransomware." IP issues are heightened for these firms, like many in the broader entertainment category, as content leaks for highly anticipated new games or updates can give a brand a black eye at best and, at worst, hit them more directly in the financials.


Driving value from data lake and warehouse modernisation

To achieve this, Data Lakes and Data Warehouses need to grow alongside the business requirements in order to stay efficient and up to date. Go Reply is a leading Google Cloud Platform service integrator (SI) helping companies across multiple sectors along this vital journey. Part of the Reply Group, Go Reply is a Google Cloud Premier Partner focussing on areas including Cloud Strategy and Migration, Big Data, Machine Learning, and Compliance. With Data Modernisation capabilities in the GCP environment constantly evolving, businesses can become overwhelmed, unsure not only of the next steps in general but, more importantly, of the next steps for them, particularly if they don’t have in-house Google expertise. Companies often need to utilise Data Lakes and Data Warehouses simultaneously, so guidance on how to do this, and on driving value from both kinds of storage, is vital. The Go Reply leadership team advises that Google Cloud Platform, the hyperscale cloud of choice for these workloads, brings technology for Data Lake and Data Warehouse efficiency, along with security superior to other market offerings.
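As one hedged illustration of using a lake and a warehouse together on GCP (the article names no specific technique), BigQuery can expose files sitting in a Cloud Storage data lake as an external table, so warehouse SQL queries them in place without a load job. Bucket, project, dataset, and table names below are hypothetical, and running it assumes GCP credentials.

```python
# Sketch: register Parquet files in a Cloud Storage "lake" as a BigQuery
# external table, then query them with ordinary warehouse SQL.
from google.cloud import bigquery

client = bigquery.Client()

external_config = bigquery.ExternalConfig("PARQUET")
external_config.source_uris = ["gs://example-lake-bucket/events/*.parquet"]

table = bigquery.Table("my-project.analytics.events_lake")
table.external_data_configuration = external_config
client.create_table(table, exists_ok=True)

# Warehouse-side SQL now reads the lake files directly.
rows = client.query(
    "SELECT COUNT(*) AS n FROM `my-project.analytics.events_lake`"
).result()
for row in rows:
    print(row.n)
```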


Three tech trends on the verge of a breakthrough in 2023

The second big trend is around virtual reality, augmented reality and the metaverse. Big tech has been spending big here, and there are some suggestions that the basic technology is reaching a tipping point, even if the broader metaverse business models are, at best, still in flux. Headset technologies are starting to coalesce and the software is getting easier to use. But the biggest issue is that consumer interest and trust are still low, if only because the science fiction writers got there long ago with their dystopian view of a headset future. Building that consumer trust and explaining why people might want to engage is just as high a priority as the technology itself. One technology trend that's perhaps closer, even though we can't see it, is ambient computing. The concept has been around for decades: the idea is that we don't need to carry tech with us because the intelligence is built into the world around us, from smart speakers to smart homes. Ambient computing is designed to vanish into the environment around us – which is perhaps why it's a trend that has remained invisible to many, at least until now.


CIOs beware: IT teams are changing

The role of IT is shifting to be more strategy-oriented, innovative, and proactive. No longer can days be spent responding to issues – instead, issues must be addressed before they impact employees, and solutions should be developed to ensure they don’t return. What does this look like? Rather than waiting for an employee to flag an issue within their system – such as recurring issues with connectivity, slow computer start time, etc. – IT can identify potential threats to workflows before they happen. They plug the holes, then they establish a strategy and framework to avoid the problem entirely in the future. In short, IT plays a critical role in successful workplace flow in both a proactive and reactive way. For those looking to start a career in IT, the onus falls on them to make suggestions and changes that look holistically at the organization and how employees interact within it. IT teams are making themselves strategic assets by thinking through how to make things more efficient and cost-effective in the long term.
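A toy example of the proactive posture described above: rather than waiting for a “disk full” ticket, a scheduled check flags the volume early. The threshold and the disk check are illustrative stand-ins for whatever signal a team actually monitors.

```python
# Illustrative proactive check: warn well before a volume fills, so the fix
# happens before any employee files a ticket.
import shutil

DISK_WARN_PCT = 80  # act long before the volume actually fills

def check_disk(path: str = "/") -> None:
    usage = shutil.disk_usage(path)
    used_pct = 100 * usage.used / usage.total
    if used_pct >= DISK_WARN_PCT:
        # in practice: open a ticket or page the on-call, not just print
        print(f"WARN {path}: {used_pct:.0f}% used -- act before users notice")

check_disk()
```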


A Comprehensive List of Agile Methodologies and How They Work

Extreme Programming (or XP) offers some of the best buffers against unexpected changes or late-stage customer demands. Feedback gathering takes place within sprints and from the start of business process development, and it’s this feedback that informs everything. This means the entire team becomes accustomed to a culture of pivoting on real-world client demands and outcomes that would otherwise threaten to derail a project and seriously warp production lead times. Any organization with a client-based focus will understand the tightrope that can exist between external demands and internal resources. Continuously orienting those resources based on external demands as they appear is the single most efficient way to achieve harmony. This is something that XP does organically once integrated into your development culture. ... Trimming the fat from the development process is what Lean Development is all about. If something doesn’t add immediate value, or tasks within tasks seem to be piling up, the laser focus of Lean Development steps in.



Quote for the day:

"Confident and courageous leaders have no problems pointing out their own weaknesses and ignorance. " -- Thom S. Rainer