“It is probably the first time in history that more than 25% of all IT roles being hired for are in information security,” said Suvarna Ghosh, Founding Partner, Maxima Group. “Earlier, such roles would constitute barely 10% of all IT hirings.” Companies are hiring security specialists into IT roles to strengthen the operational effectiveness of their teams. “Companies are also introducing some new roles for CISO and assistant CISO positions. New roles are also being created because the focus is on ensuring that information security vulnerabilities can be better dealt with,” Ghosh adds. The typical criteria are experience in the security and networking domains and the ability to ensure that the virtual workspace infrastructure is secure. ... While there is a surge in demand for CISOs and CDOs, CIOs may not benefit from the trend. In fact, the surge in demand for the two roles could come at the expense of CIOs. The demand for CTOs and other IT roles has fallen from 93% pre-Covid to 74% now, according to a survey conducted by Maxima Group, whose clientele includes MNCs and VC-funded startups.
What is now critical is the value of co-development and co-innovation, as not all new market entrants need to be seen as competitors. The pace of change today means it is impossible to keep innovation in-house. Instead, incumbents are already starting to collaborate with fintechs and other third-party technology providers. This allows them to benefit from the expertise of companies which specialise in the innovative technologies that will help them compete, but which would be too expensive and take too long to develop internally. A cloud-enabled, platform-based approach is the launchpad from which banks and fintechs can collaborate and innovate more easily – creating new value and driving new market opportunities in an Open Banking world. Additionally, in a partnership ecosystem a cloud-based environment allows banks to safely test and explore partnerships – and to help regulators encourage innovation whilst ensuring consumer protection. The current situation has brought into sharp focus the need for established banks to learn from the digital disruptors.
Data lakes serve as vast collections of raw enterprise data. Using them involves migrating or copying raw data — structured, unstructured and semi-structured — as well as transformed data for specific purposes, such as analytics, visualization and reporting. Data lakes once held the promise of making it easier to ingest, combine and analyze diverse data in service of machine learning and artificial intelligence (AI) efforts. In reality, as data lakes host more and more data, it becomes difficult for users to know what's in them and how the data is connected. In essence, a data lake is raw data in a large distributed file system, much like the one on your PC. So the whole enterprise ends up spending as much time looking for things in the data lake as we do individually on our laptops. You may not realize it, but you're probably already familiar with knowledge graphs, since they're used extensively by Google, Facebook, Apple, LinkedIn, Uber and many others. Knowledge graphs connect data based on what it means, without duplicating or copying the data. This effectively allows companies to act on their data as if silos don't exist! It also avoids complex ETL (extract, transform, load) jobs and saves money on the cloud hosting costs of duplicated data.
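The "connect without copying" idea above can be sketched as a minimal triple store in plain Python. All entity names, predicates, and storage paths here are illustrative assumptions, not any vendor's API; real knowledge graphs (RDF stores, property graphs) work on the same principle at scale.

```python
# Minimal knowledge-graph sketch: (subject, predicate, object) triples
# link records by meaning while storing only references to the data,
# never copies of the underlying rows.

class KnowledgeGraph:
    def __init__(self):
        self.triples = set()

    def add(self, subject, predicate, obj):
        self.triples.add((subject, predicate, obj))

    def query(self, subject=None, predicate=None, obj=None):
        """Return triples matching the pattern; None acts as a wildcard."""
        return [
            t for t in self.triples
            if (subject is None or t[0] == subject)
            and (predicate is None or t[1] == predicate)
            and (obj is None or t[2] == obj)
        ]

kg = KnowledgeGraph()
# Records live in separate "silos" (a CRM and a billing system); the graph
# holds only pointers to them, plus the semantic link between them.
kg.add("customer:42", "storedIn", "crm_db/customers/42")
kg.add("invoice:7", "billedTo", "customer:42")
kg.add("invoice:7", "storedIn", "billing_db/invoices/7")

# One query spans both silos: which invoices belong to customer 42?
print(kg.query(predicate="billedTo", obj="customer:42"))
```

The key design point is that nothing is duplicated: the graph answers cross-silo questions by following links, which is what lets companies skip the ETL jobs and duplicate-storage costs mentioned above.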
Conceivably, if the component containing 5G radio technology were based on a completely open plan -- as much a "white box" for developers and engineers as the "clone" PCs of the 1980s -- then the barrier to entry for other players to get involved and become competitive could be knocked down. The threat of vendor lock-in for any operator contracting with Huawei, Ericsson, or Nokia would evaporate, as Cisco, Samsung, and potentially others such as Qualcomm and Intel could suddenly become options. Perhaps the name of the startup entering 5G's new open ring has yet to be concocted. Among the O-RAN Alliance's members are China Mobile, AT&T, Verizon, NTT DoCoMo, Sprint, Cisco, Dell Technologies, Facebook, Microsoft, Nokia, Ericsson, IBM, Intel, and ZTE. China's interests are well represented, along with America's and Europe's. Huawei, however, has been an active opponent of O-RAN, arguing since well before the pandemic that while cheaper up front, a white box would be more expensive and difficult to maintain over time, and would never perform as well as its own components.
First and foremost, patching needs to be under control. Many businesses struggle with this, especially with third-party patches for Java and Adobe products, and hackers love it. Unless software updates are deployed in a timely fashion, the organization is a sitting duck: a network is just one click away from compromise. Effective malware protection is also a necessity. Steer away from the traditional and look toward advanced malware tools, including non-signature/cloud-based antivirus, whitelisting and network traffic monitoring/blocking technologies. Data backups are critical. Organizations' systems -- especially the servers that are at risk of ransomware infection -- are only as good as their last backup. Discussions around backups are boring, but they need to be well thought out to minimize the impact of the ransomware that does get through and encrypts critical assets. Network segmentation is another important part of ransomware protection, but it's not always deployed properly.
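The "only as good as their last backup" point lends itself to simple automation. The sketch below, under assumed paths and a 24-hour freshness threshold (both illustrative, not a standard), checks whether the newest file in a backup directory is recent enough to be useful after a ransomware incident.

```python
# Sketch: flag a backup directory whose newest backup is older than a
# threshold. Directory layout and the 24-hour default are assumptions.
import os
import time

def backup_age_hours(backup_dir):
    """Hours since the most recently modified file in backup_dir."""
    paths = [os.path.join(backup_dir, f) for f in os.listdir(backup_dir)]
    files = [p for p in paths if os.path.isfile(p)]
    if not files:
        return float("inf")  # no backups at all: infinitely stale
    newest = max(os.path.getmtime(p) for p in files)
    return (time.time() - newest) / 3600

def backups_are_fresh(backup_dir, max_age_hours=24):
    """True if at least one backup is newer than max_age_hours."""
    return backup_age_hours(backup_dir) <= max_age_hours
```

A check like this run from a scheduler turns a boring backup discussion into an alert that fires before, rather than after, ransomware encrypts critical assets.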
From a machine learning perspective, the feature vectors necessary for supervised learning are formed from metadata about commits (e.g. the number of files or the average age of files, to name just two), taken from the source control system. The class we want to predict is the result of a test, taken from the test execution data. We bundle these data by test case, meaning we get one model for each test case and ML algorithm we evaluate. Features arise from domain knowledge and intuition, e.g. "larger code changes can break more things". At this point, the features are just descriptions of a commit, though. Whether they can explain a certain test case’s results depends on the other features present, the total training data, and the algorithms used for learning; this is assessed by evaluating the machine learning system with separate training and test data. ... One interesting thing to note is the different questions we get when presenting the system at conferences. At a tester-focused conference, questions were (as expected) test-case focused, e.g. "Can we judge a test case’s quality by whether it is easily predictable or not?"
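The per-test-case setup described above can be sketched end to end. The feature names (`files_changed`, `avg_file_age_days`), the toy commit history, and the decision-stump learner are all illustrative assumptions standing in for whatever features and ML algorithms the real system evaluates; the shape of the problem (commit metadata in, one pass/fail model per test case out) is what the sketch shows.

```python
# One model per test case: commit metadata -> did this test pass?
# The learner here is a simple decision stump chosen for illustration.

def train_stump(samples):
    """samples: list of (feature_dict, passed). Returns the
    (feature, threshold, label_below, label_above) stump with the
    best training accuracy."""
    best, best_acc = None, -1.0
    for name in samples[0][0]:
        for thr in sorted({f[name] for f, _ in samples}):
            for below, above in ((True, False), (False, True)):
                preds = [below if f[name] <= thr else above
                         for f, _ in samples]
                acc = sum(p == y for p, (_, y)
                          in zip(preds, samples)) / len(samples)
                if acc > best_acc:
                    best_acc, best = acc, (name, thr, below, above)
    return best

def predict(stump, features):
    name, thr, below, above = stump
    return below if features[name] <= thr else above

# Toy history for a single test case ("larger changes break more things"):
history = [
    ({"files_changed": 2,  "avg_file_age_days": 300}, True),
    ({"files_changed": 3,  "avg_file_age_days": 250}, True),
    ({"files_changed": 40, "avg_file_age_days": 10},  False),
    ({"files_changed": 35, "avg_file_age_days": 5},   False),
]
model = train_stump(history)
print(predict(model, {"files_changed": 50, "avg_file_age_days": 8}))  # -> False
```

Evaluating such a model on held-out commits, as the article describes, is what reveals whether these metadata features actually explain a given test case's results.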
While it is a given that agile development has become the norm for software development, true business agility requires more than having scrum teams deliver working solutions. Moreover, if you only focus on the small-scale agility provided by agile software development, you might miss the forest for the trees: why do you want to be agile as an enterprise, and what does that require? An enterprise is more than just a collection of local developments by small teams. The pieces of the puzzle that these teams work on must fit together somehow. And hopefully there is a vision of the future, aligned with organization strategy: a set of goals that the organization aims for. That is where Enterprise Architecture enters the stage. Both approaches have their merits and shortcomings. EA without agile may lead to slow, bureaucratic organizations that do not respond fast enough to changes and trends, while a horde of scrum teams without some integrative, overarching approach may lead to a disconnected landscape of agile silos. However, if we build on the strengths of both approaches, we can create enterprises that move as a united whole without a central, command-and-control management that stifles local development and innovation.
Although there is some crossover, there are stark differences between data architecture and enterprise architecture (EA). That’s because data architecture is actually an offshoot of enterprise architecture. In simple terms, EA provides a holistic, enterprise-wide overview of an organization’s assets and processes, whereas data architecture gets into the nitty-gritty. The difference between the two can be represented with the Zachman Framework, an enterprise architecture framework that provides a formalized view of an enterprise across two dimensions. ... Good data leads to better understanding and ultimately to better decision-making. Organizations that can find ways to extract data and use it to their advantage will be successful. To do so, however, they need to understand what data they have, what it means, and where it is located. Without this understanding, data can proliferate and become more of a risk to the business than a benefit.
We've been chasing the dream of single sign-on (SSO) for as long as I can remember. Some customers believe they can achieve it by choosing the "right" federation (STS) provider. Azure AD can help significantly to enable SSO capabilities, but no STS is magical: too many "legacy" authentication methods are still used for critical applications. Extending Azure AD with partner solutions can address many of these scenarios. SSO is a strategy and a journey, and you can't get there without moving toward standards-based applications. Related to this is the journey to passwordless authentication, which also has no magical answer. Multi-factor authentication (MFA) is essential today. Add user behavior analytics and you have a solution that prevents the majority of common cyber-attacks. Even consumer services are moving to require MFA. Yet I still meet many customers who do not want to move to modern authentication approaches. The biggest argument I hear is that it will impact users and legacy applications. Sometimes a good kick might help customers move along -- Exchange Online, for example, has announced the retirement of basic (legacy) authentication. Many Azure AD reports are now available to help customers with this transition.
Data governance problems also arise in how standardized data is illustrated and presented to the public. There are several techniques for creating a graph that represents the number of new cases by country; two common choices are a linear scale and a logarithmic scale. When the numbers are skewed toward high values, a logarithmic scale is preferable for showing the rate of change in the number of new cases over time. The logarithmic scale is extremely useful for displaying data that spans a large range, but it may not be understood by all readers. Without proper labeling, a reader can be misled into believing that a graph uses a linear scale. For example, on a logarithmic scale of COVID-19 cases by country, the large number of confirmed cases in the United States can be graphed alongside the smaller numbers of confirmed cases in Japan and South Korea. A linear scale is most often assumed for a line graph, since it is the first method students learn in primary school.
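The linear-versus-logarithmic point can be made concrete with a few numbers. On a base-10 logarithmic axis every factor of 10 occupies the same visual distance, which is why counts spanning several orders of magnitude fit on one readable chart. The country labels and counts below are illustrative, not real case data.

```python
import math

# Illustrative counts spanning three orders of magnitude.
cases = {"Country A": 1_000_000, "Country B": 10_000, "Country C": 1_000}

for country, n in cases.items():
    # On a linear axis, Country A sits 1000x farther out than Country C,
    # flattening C's curve to near-zero. On a log10 axis their positions
    # are 6.0 and 3.0 -- both trends stay readable.
    print(f"{country}: {n:>9,} cases -> log10 position {math.log10(n):.1f}")
```

This is also the source of the labeling risk noted above: a reader who assumes the axis is linear will read the gap between positions 3.0 and 6.0 as a factor of two rather than a factor of one thousand.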
Quote for the day:
"The real problem is not whether machines think but whether men do." -- B. F. Skinner