Daily Tech Digest - April 15, 2023

6 best practices to develop a corporate use policy for generative AI

The first step in crafting your corporate use policy is to consider its scope. For example, will it cover all forms of AI or just generative AI? Focusing on generative AI may be a useful approach, since it addresses large language models (LLMs), including ChatGPT, without having to boil the ocean across the AI universe. ... Involve all relevant stakeholders across your organization – This may include HR, legal, sales, marketing, business development, operations, and IT. Each group may see different use cases and different ramifications of how the content may be used or misused. Involving IT and innovation groups can help show that the policy isn’t just a clamp-down from a risk-management perspective, but a balanced set of recommendations that seeks to maximize productive use and business benefit while at the same time managing business risk. Consider how generative AI is used now and may be used in the future – Working with all stakeholders, itemize all the internal and external use cases being applied today, as well as those envisioned for the future.


There Is No Resilience without Chaos

Chaos engineering has emerged as an increasingly essential process for maintaining application reliability, not only in cloud native environments but in any IT environment. Unlike pre-production testing, chaos engineering involves determining when and how software might break in production by deliberately injecting failures under controlled conditions. In this way, chaos engineering becomes an essential way to prevent outages long before they happen. ... Chaos engineering, when done properly, requires observability. Problems that can cause outages or degrade performance can be detected well ahead of time, as bugs, performance bottlenecks, security vulnerabilities, and the like become manifest during a proper chaos engineering experiment. Once these bugs and kinks, which can potentially lead to outages if left unheeded, are detected and resolved, true continued resiliency in DevOps can be achieved. In the event of a failure, the SRE or operations person seeking the source of the error is often overloaded with information.
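As a rough illustration of the idea (not from the article), a minimal chaos experiment can wrap a service call so that latency and failures are injected at random, then tally how often the system misbehaves; `call_service`, `with_chaos`, and `run_experiment` are hypothetical names for this sketch:

```python
import random
import time

def call_service():
    """Stand-in for a real downstream dependency (hypothetical)."""
    return "ok"

def with_chaos(fn, failure_rate=0.2, max_delay_s=0.001):
    """Wrap a call so it randomly gains latency or raises an error,
    simulating the faults a chaos experiment would inject."""
    def wrapped():
        time.sleep(random.uniform(0, max_delay_s))  # injected latency
        if random.random() < failure_rate:
            raise RuntimeError("injected failure")  # simulated outage
        return fn()
    return wrapped

def run_experiment(trials=100):
    """Count injected failures; a real experiment would stream these
    observations into an observability pipeline instead of tallying."""
    chaotic = with_chaos(call_service)
    failures = 0
    for _ in range(trials):
        try:
            chaotic()
        except RuntimeError:
            failures += 1
    return failures
```

In practice the injection would target real infrastructure (network, instances, dependencies) under a controlled blast radius, which is where the observability the excerpt mentions becomes essential.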


Data Governance: Simple and Practical

Purpose-driven data governance programs narrow their focus to deliver urgent business needs and defer much of the rest, with a couple of caveats. First, data governance programs are doomed to fail without senior executive buy-in and continuous engagement of key stakeholders. Without them, no purpose can be fulfilled. Second, data governance programs must identify and gain commitment from relevant (but perhaps not all) data owners and stewards, though that doesn’t necessarily mean roles and responsibilities need to be fully fleshed out right away. Identify the primary purpose, then focus on it – it sounds like a simple formula, but it’s not obvious. Many data governance leaders are quick to define and pursue their three practices or five pillars or seven elements, and why shouldn’t they? They need those capabilities, but wanting it all comes at the sacrifice of getting it now. Generate business value with your primary purpose before expanding. ... An insurer explained to me that their dashboards weren’t always refreshed, and when they were, wide fluctuations in values made it impossible to make informed decisions.


What is platform engineering? Evolving devops

The developer portal is the main mechanism and expression of platform engineering. Its main purpose is to gather the developer's tooling, documentation, and interactivity in one place. It is a kind of front end to the organization's developer infrastructure. Developer portals (aka internal developer platforms) have evolved out of several needs and trends. This primer on developer portals delineates these tools into three types: universal service catalog, API catalog tied to an API gateway, and microservices catalog. APIs loom large in platform engineering because the uptake of microservices architecture has greatly increased complexity for modern software teams. Orchestrating microservices in a large organization can be very challenging. Just understanding which microservices are involved in a given use case can be difficult. A developer portal offers a unified view into the overall web of microservices. Another aspect of the developer portal is offering a standard framework for combining the tools used by the organization.


EU privacy regulators to create task force to investigate ChatGPT

In a statement posted on its website, the EDPB said the task force was intended to “foster cooperation and to exchange information on possible enforcement actions conducted by data protection authorities.” Last month, Italy’s data privacy regulator issued a temporary ban against ChatGPT over alleged privacy violations relating to the chatbot’s collection and storage of personal data. Italy’s Garante, the national data protection authority, ordered the temporary halt on the processing of Italian users’ data by ChatGPT’s parent firm OpenAI unless it complied with EU privacy laws. In order to have the service reinstated, the Garante outlined a list of data protection requirements that OpenAI must comply with, including increased transparency into how ChatGPT processes data, the right for nonusers to opt out of having their data processed, and an age-gating system for signing up to the service. In the wake of the ban, OpenAI CEO Sam Altman tweeted: “We of course defer to the Italian government and have ceased offering ChatGPT in Italy (though we think we are following all privacy laws).”


The mechanics of entrepreneurship

Lidow codifies this innovative shove, arguing that entrepreneurs invent and create enduring change in one of three ways: by scaling supply, scaling demand, or scaling simplicity. The first category includes those who scaled up their supply by devising an efficient system and then repeating it. In the late 1700s, for example, the enterprising coin-maker Matthew Boulton leveraged his superior knowledge of metalworking to create a new process for producing coins quickly and uniformly, spawning countless societal changes. The category also includes the swarm of entrepreneurs in the early to mid-1800s who conceived the modern railway. Titans of the second category, scaled demand, include cultivators of desire like Wedgwood, Selfridge, and the American PR pioneer Edward Bernays, who coined the phrase “public relations” and created the industry. Through carefully cultivated propaganda campaigns, Bernays convinced wide swaths of the US public to support the country’s efforts in World War I and, later, stimulated broad demand for products such as bacon and tobacco.


3 IT leadership mistakes to avoid

The first exercise we undertook was brainstorming and agreeing on a set of operating principles, such as: all ideas would be respected regardless of which side they came from; facts and data—not emotion—would drive decision-making; and creating a positive client experience would be our collective North Star. These principles became our rallying cry and helped lead the team to a very successful client conversion. Contrast that with leaders who set rigid rules for their teams to follow. Leading by a set of hard rules will limit innovation, hinder individual and team development, and create a constant need to add or modify the rules as situations change. ... There is no such thing as a perfect organizational structure—there’s only an array of alternatives, each with its own respective strengths and weaknesses. The only way to make an inherently flawed organizational structure work is to have individuals collaborate under a common strategy, purpose, and shared goals. Great teams are also made up of individuals who are willing to sacrifice for the good of the whole.


Data leader Tejasvi Addagada on the value of data governance

If data is siloed, it cannot be used to develop insights and products. For an organization that has yet to invest in managing its data and sees centralization as costly or a bottleneck, a data mesh architecture offers a decentralized approach, with each domain team ingesting its own operational and analytical data and developing data products. ... From the initial concept of corporate governance, IT governance has evolved into the more recent concept of data governance. Globally, the adoption of cloud services, the evolution of modern data stacks, and improved data literacy have led to greater interest in governing data over the past years. Implementing data governance is necessary to get sustainable value from data. A subfunction can be formalized as an authorized provisioning service. It can support activities that help ensure a data element is rightfully sourced from a designated provisioning point. In addition, it can let the domain team express their trust by certifying data as a system of record that is authorized to provision.


Google Cloud Unveils AI Tools to Streamline Preauthorizations

“The Claims Acceleration Suite’s Claims Data Activator uses Document AI, Healthcare Natural Language API, and Healthcare API to convert this unstructured data to structured data and establish data interoperability,” Waldron says. “This speeds up the process, and significantly reduces administrative burdens and costs, enabling experts to make faster, more informed decisions that improve patient care.” A quick prior authorization process is essential for a patient who may need approval for transportation to an important medical procedure such as a colonoscopy, according to Waldron. Patients also seek prior authorizations to use a digital device as part of weight management or a care management plan for conditions such as diabetes. A goal of Google’s Claims Data Activator is to make healthcare prior authorization data more interoperable, or accessible for all parties.


Data sharing between public and private is the answer to cybersecurity

Businesses and governments are already interlinked in their attempts to keep ahead of cybercriminals. You only need to look at the recent Royal Mail attack, which saw the NCSC and the business working together to reduce its impact. And across the Pond, Biden’s newly announced Cybersecurity Strategy will focus on ensuring closer collaboration on cyber between government and industry. Whilst all of this is moving in the right direction, there’s more work to be done to create more intentional and systematic cross-sharing and learning from one another. To kickstart the open flow of knowledge in the industry, both public and private organizations could sponsor a wider peer network for security experts that streamlines intelligence from private to public, or vice versa, and offers support. Gartner offers a Peer Connect network of business leaders that encourages the open discussion of trends and ideas critical to business decision-making.



Quote for the day:

"A leader's dynamic does not come from special powers. It comes from a strong belief in a purpose and a willingness to express that conviction." -- Kouzes & Posner

Daily Tech Digest - April 07, 2023

Why leadership training fails — and how to fix it

It could be attributed to the existing culture of the leadership team. Do the organization’s leaders possess a growth mindset, or is intellectual humility lacking? Unless a leader wants to improve, it’s unlikely that they will. Leaders must be motivated to make the time, and have the patience, for the kind of reflective practice that makes learning stick. What about the group of high-potential employees who have every intention of applying what they learn, but then struggle to translate the skills and knowledge into practice? It’s possible that the way the program is designed is hindering successful learning transfer. In leadership training, one size does not fit all. To be effective, it must be designed with the learners’ needs in mind, whether they’re high-performing individual contributors without supervisory experience or C-suite executives. If participants don’t find the content relevant to their role and objectives, learner engagement will suffer. Lastly, leadership training cannot be presented as a one-off event; rather, it must be an ongoing process. Formal training is only one aspect of learning.


Delivery Leadership is both an Art and a Science

As businesses become increasingly interconnected and globalized, enterprises worldwide seek modern technology products that are straightforward yet impactful in enhancing operational efficiency, productivity, and market penetration while reducing operational expenses. These requirements prompt organizations to explore technologies such as cloud computing, ERP, AI, data analytics, automation, and business intelligence. Delivering intricate IT solutions and services demands expertise and attributes from both the art and the science sides of the field. ... Delivery leaders must have industry, domain, and business knowledge to be successful in their role. They must place themselves in the shoes of customers and think about which value-add services their customers perceive to be important for their businesses. A proactive approach and forward thinking are the two most important skills a delivery leader must possess. They must also encourage a culture in which sharing prescriptive approaches and making business recommendations become the new norm.


Asynchronous Patterns for Microservice Communication

Since microservices communicate using asynchronous methods, keeping their patterns fast and responsive is essential. Fortunately, there are several quick ways to do this. For example, favoring asynchronous communication through a message broker such as RabbitMQ over synchronous methods lets you maximize network efficiency while minimizing response delays. You can also use broker-level retries for excellent reliability and scalability when communicating between machines. In addition, it’s a good idea to use event-driven communication for better responsiveness between your components and clients. If you want to connect multiple microservices without creating dependencies or tightly coupling them, consider asynchronous message-based communication in your microservices architecture. This approach leverages events to facilitate communication between microservices, which is commonly referred to as event-driven communication.
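The decoupling described above can be sketched with a toy in-memory broker (illustrative only; the topic and service names are hypothetical, and a production system would use a real broker such as RabbitMQ):

```python
from collections import defaultdict

class InMemoryBroker:
    """Toy message broker: each topic maps to a list of subscriber callbacks."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        # The publisher never references subscribers directly,
        # which is what keeps the services loosely coupled.
        for handler in self._subscribers[topic]:
            handler(event)

broker = InMemoryBroker()
shipped = []

# Hypothetical services: orders publishes an event; shipping and
# billing each react independently.
broker.subscribe("order.created", lambda e: shipped.append(e["id"]))
broker.subscribe("order.created", lambda e: print(f"billing order {e['id']}"))

broker.publish("order.created", {"id": 42, "total": 99.0})
```

Adding a new consumer (say, analytics) is just another `subscribe` call; the publishing service never changes, which is the loose coupling the pattern is after.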


GPT and the Future of High-Performance Computing and Big Data Analytics

The emergence of Generative Pre-trained Transformer (GPT) models has revolutionized the field of high-performance computing and big data analytics. GPT models are capable of learning from large datasets and producing highly accurate results with minimal effort. This has enabled organizations to quickly analyze large datasets and extract meaningful insights. GPT models have been successfully used in a variety of applications, such as natural language processing, image recognition, and machine translation. With the increasing availability of large datasets, GPT models are expected to become even more powerful and efficient. This will enable organizations to gain deeper insights into their data and make better decisions. In addition, GPT models can be used to speed up the development of high-performance computing systems. GPT models can be used to optimize the hardware and software components of these systems, allowing them to run faster and more efficiently. 


Essential Soft Skills for Testers: Unlocking Success in Your Testing Career

Collaboration skills are essential for testers, as they often work closely with various team members, including developers, product managers, and other stakeholders, to ensure the delivery of high-quality products. In this section, we will explore three crucial collaboration skills that enable testers to be effective team players: active participation, cross-functional cooperation, and providing and receiving constructive feedback. Active participation refers to engaging fully in team activities, sharing ideas, and contributing meaningfully to discussions and decisions. ... Cross-functional cooperation is the ability to work effectively with team members from different departments or areas of expertise. Testers who excel in cross-functional cooperation can effectively communicate with developers, designers, product managers, and others to identify and resolve issues, share knowledge, and promote a shared understanding of project goals. This skill is particularly important for testers in agile environments, where cross-functional teams are the norm and effective cooperation is critical for delivering high-quality products on time.


What Engineers Need to Know About Using Agile for Digital Transformation

As Agile software development techniques became more widely applied, the pace of technological change continued to quicken. From the emergence of the cloud to the increase in mobility, and on to the rise of data analytics and artificial intelligence, businesses in every sector began using IT systems to power internal processes and external services. Digital transformation has emerged as shorthand for businesses seeking to reinvent themselves on a foundation of digital data and technology. Whether it’s digitizing paper records, creating new electronic channels to market, or analyzing data to produce new insights, companies can use technology to improve an existing business process. Agile development has played a crucial role in many of these digitalization programs, especially the creation of IT applications. The successful rollout of these software-focused projects has encouraged engineers to start thinking about how Agile techniques can be used in other areas of IT, including digital transformation initiatives.


How generative AI can hurt cloud operations

Generative AI algorithms can be incompatible with existing cloud computing systems, leading to integration issues. This can delay the deployment of generative AI algorithms and cause problems with system performance or efficiency. ... Generative AI algorithms can exhibit unpredictable behavior, which can lead to unexpected outcomes. This can result in system errors, degraded system performance, and other issues that are impossible to predict. I suspect we’ll get better at predicting behavior as we learn more about generative AI system operations, but the learning curve will be painful. I’ve already seen some generative AI systems pulled off cloud platforms due to unpredictable behavior and, what’s worse, unpredictable cloud computing bills. Generative AI is an unstoppable force in the enterprise technology space. It’s yet another technology made more accessible and affordable by cloud computing, and the easy availability of this technology will reverberate through the marketplace. Generative AI will become a technology that allows businesses to succeed by out-innovating their competition.


Data, AI and automation will never replace humans. Fact

While these technologies are nothing new, they do continue to advance at pace. This presents the opportunity to leverage them to help solve some of the biggest challenges we face in society as well as in business. But we will only succeed if we remain the masters of the technology, not its servants. Using AI and automation to empower people, not replace them, allows organisations to be data-driven yet technology-enabled and people-centric, with these pieces of software used as tools to help humans do their best work and remove the drudgery of manual tasks. And it makes complete sense, because there will always be a moment of truth when a human must be involved at a crucial point. An automated process might take someone 75 per cent of the way, but a person needs to complete the rest. And if they can put all their effort into that 25 per cent, the result will be a better outcome for the employee, the customer and the organisation that brings them together. Ultimately, those who try to remove people from the equation are destined to fail.


How artificial intelligence can inform decision-making

To implement AI for decision-making, organizations need a modern data infrastructure to support new data types and often massive amounts of data. Many organizations are moving to the cloud for data management and making use of data engineers and newer pipeline tools to help integrate data and make sure it is trustworthy. They are also hiring DevOps teams to deploy models and monitor them in production. According to a TDWI Best Practices Report, 67 percent of organizations deploying AI technologies today state that AI projects are built by data scientists and are deployed into production by DevOps teams. Some organizations are also using augmented intelligence applications, where AI is infused into the software to automate functionality such as data cleansing, deriving insights, or building predictive models. In addition to hiring specialists, organizations must also build excitement and trust among all employees. It is essential to involve stakeholders in the design and implementation of AI systems to ensure that they understand how the systems work and are comfortable using them.

CDOs Want Increased Investments in Data Management, Cloud

“The challenge comes from managing the increased volume and variety of data, the need to integrate it into business intelligence, and most notably, the need to keep data infrastructure updated to ensure various data types are supported and organizations are following data compliance considerations,” he says. He explains CDOs are seeking greater investments in data management because they need to identify how they can create a measurable impact for the customer experience. “This can only be done by collecting and analyzing data, which can often be tedious and require large sums of time and resources,” Adya says. “Hence, many are doubling down on data management resources to get the job done quicker and easier.” As the digital ecosystem evolves, enterprises are being forced to innovate and rely on cloud to accelerate digital transformation. “Effectively leveraging data through the cloud gives organizations a competitive edge and increases resiliency by being able to respond to disruptions and spot new market opportunities through intelligent data,” he explains.



Quote for the day:

"To have long term success as a coach or in any position of leadership, you have to be obsessed in some way." -- Pat Riley

Daily Tech Digest - April 06, 2023

AI might not steal your job, but it could change it

People whose jobs deal with language could, unsurprisingly, be particularly affected by large language models like ChatGPT and GPT-4. Let’s take one example: lawyers. I’ve spent time over the past two weeks looking at the legal industry and how it’s likely to be affected by new AI models, and what I found is as much cause for optimism as for concern. The antiquated, slow-moving legal industry has been a candidate for technological disruption for some time. In an industry with a labor shortage and a need to deal with reams of complex documents, a technology that can quickly understand and summarize texts could be immensely useful. So how should we think about the impact these AI models might have on the legal industry? ... AI in law isn’t a new trend, though. It has already been used to review contracts and predict legal outcomes, and researchers have recently explored how AI might help get laws passed. Recently, consumer rights company DoNotPay considered arguing a case in court using an argument written by AI, known as the “robot lawyer,” delivered through an earpiece.


Microservice Architecture Key Concepts

Generally, you can use a message broker for asynchronous communication between services, though it’s important to choose one that doesn’t add complexity to your system, or latency if it fails to scale as message volume grows. Version your APIs: keep an ongoing record of the attributes and changes you make to each of your services. “Whether you’re using REST API, gRPC, messaging…” wrote Sylvia Fronczak for OpsLevel, “the schema will change, and you need to be ready for that change.” A typical pattern is to embed the API (application programming interface) version into your data/schema and gracefully deprecate older data models. For example, for your product-information service, the requestor can ask for a specific version, and that version is indicated in the data returned as well. Less chit-chat, more performance: synchronous communication creates a lot of back and forth between services. If synchronous communication is really needed, this will work for a handful of microservices, but when dozens or even hundreds of microservices are in play, it can bring scaling to a grinding halt.
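The versioning pattern described above (version embedded in the returned data, old shapes deprecated gracefully) might look like this sketch; the service, field, and version numbers are hypothetical:

```python
# Toy product-information endpoint: the caller requests a schema version,
# and the response echoes the version it was served with.
SUPPORTED_VERSIONS = {1, 2}

def get_product(product_id, version=2):
    if version not in SUPPORTED_VERSIONS:
        raise ValueError(f"unsupported API version {version}")
    product = {"version": version, "id": product_id, "name": "widget"}
    if version >= 2:
        # v2 introduced a structured price object
        product["price"] = {"amount": 9.99, "currency": "USD"}
    else:
        # deprecated flat field, kept so v1 callers don't break
        product["price"] = 9.99
    return product
```

Because each response carries its version, clients can be migrated one at a time, and the old shape can be retired once no requestor asks for it.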


'Silent Success': How to master the art of quiet hiring for your business

“Ironically, quiet hiring is neither ‘quiet’ nor is any ‘hiring’ involved in it in the traditional sense,” says Bensely Zachariah, Global Head of Human Resources at Fulcrum Digital, a business platform and digital engineering services company. Quiet hiring entails companies upskilling their existing employees and moving them to new roles or new sets of responsibilities, on a temporary or, in some cases, permanent basis to meet the ever-evolving demands of the business environment. Zachariah says: “Quiet hiring is essentially the opposite of ‘quiet quitting’, a buzzword during 2022, which, in simple words, means doing the bare minimum for what it takes to keep your job. The concept behind quiet hiring is rewarding high-performing individuals with more challenging roles, pay rises, bonuses, or promotions. This is not a new concept per se; in fact, it is an age-old practice which was referred to as ‘facilitated talent mobility’ or ‘career advancement’, where organisations have spent considerable time and resources to facilitate upskilling/cross-skilling employees to give them new roles/avenues for work.”


AI and privacy concerns go hand in hand

Whether personal information is publicly available or not, its collection and use is still subject to the Privacy Act. While it’s on businesses to operate within the law, it pays for the public to upskill themselves and be savvy about what information they’re posting, and where. We know that criminals are becoming an even greater threat online because cybersecurity breaches are increasing and result in costly hacks of personal information. AI can be used to supercharge these criminals, leading to more privacy breaches, and making it even harder for cybersecurity systems to protect your information or for post-breach measures such as injunctions to protect stolen data that criminals may make available online. Powerful AI can aggregate data to a much greater degree, much more swiftly than humans can, meaning AI can potentially identify people that would otherwise not be identifiable through more time-intensive methods. Even seemingly benign online interactions could reveal more about you than you ever intended.


The Benefits of a Streaming Database

Experienced engineers understand that no software stack or tool is perfect; each comes with a series of trade-offs for every specific use case. With that in mind, let’s examine the particular trade-offs inherent to streaming databases to better understand the use cases they align best with. Incrementally updated materialized views – Streaming databases build on different dataflow paradigms that shift limitations elsewhere and efficiently handle incremental view maintenance over a broader SQL vocabulary. Other databases like Oracle, SQL Server, and Redshift have varying levels of support for incrementally updating a materialized view. They could expand support, but will hit walls on fundamental issues of consistency and throughput. True streaming inputs – Because they are built on stream processors, streaming databases are optimized to individually process continuous streams of input data (e.g., messages from Kafka). Scaling streaming inputs elsewhere involves batching them into larger transactions, slowing down data and losing granularity. In traditional databases (especially OLAP data warehouses), larger, less frequent batch updates are more performant.
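To make the materialized-view trade-off concrete, here is an illustrative sketch (not tied to any particular product) of incremental view maintenance: each arriving event updates the stored aggregate in constant time, rather than the query recomputing over all history.

```python
class RunningTotalView:
    """Toy incrementally maintained view: per-key running sums that are
    updated one event at a time instead of recomputed on every query."""
    def __init__(self):
        self.totals = {}

    def apply(self, event):
        key, amount = event["key"], event["amount"]
        self.totals[key] = self.totals.get(key, 0) + amount

view = RunningTotalView()
stream = [
    {"key": "a", "amount": 3},
    {"key": "b", "amount": 5},
    {"key": "a", "amount": 2},
]
for event in stream:  # each message updates the view in O(1)
    view.apply(event)

print(view.totals)  # {'a': 5, 'b': 5}
```

Real streaming databases extend this idea to much richer SQL (joins, windows) while preserving consistency, which is the hard part the excerpt alludes to.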


6 steps to measure the business value of IT

A challenge for determining the value contribution is the selection of suitable key figures. According to the study, IT departments today primarily use technical and IT-related metrics. That is legitimate, but in this way, there’s no direct connection to the business. Plus, there’s often a lack of affinity for meaningful KPIs, both in IT and in the specialist departments, says Jürgen Stoffel, CIO at global reinsurer Hannover Re. Therefore, in practice, only a few metrics suitable for both sides would be found, and the result is the IT value proposition is often unseen. “A consistent portfolio of metrics coordinated with the business would be helpful,” says Thomas Kleine, CIO of Pfizer Germany, and Held from the University of Regensburg adds: “Companies have to get away from purely technical key figures and develop both quantitative and qualitative metrics with a business connection.” In order to make progress along this path, the consultants developed a process model with several development and evaluation phases, using current scientific findings and speaking to CIOs.


Strategic risk analysis is key to ensure customer trust in product, customer-facing app security

Assessing risk requires identifying baseline security criteria around key elements such as customer contracts and regulatory requirements, Neil Lappage, partner at LeadingEdgeCyber and ISACA member, tells CSO. “From the start, you've got things you’re committed to such as requirements in customer contracts and regulatory requirements and you have to work within those parameters. And you need to understand who your interested parties are, the stakes they've got in the game, and the security objectives.” The process of defining the risk profile of an organization also requires strong collaboration among IT, cybersecurity, and risk professionals. “How the organization knows the risk profile of the organization involves the cybersecurity team working with the IT and reporting to the business so these three things — cyber, IT and risk — work in unison,” he says. “If cyber sits isolated from the rest of the business, if it doesn't understand the business, the risk is not optimized.”


FBI Seizes Genesis Cybercriminal Marketplace in 'Operation Cookie Monster'

The seizure of Genesis was a collaborative effort between international law enforcement agencies and the private sector, according to the notice, which included the logos of European law enforcement agency Europol; Guardia Civil in Spain; Polisen, the police force in Sweden; and the Canadian government. The FBI also is seeking to speak with those who've been active on the Genesis Market or who are in touch with administrators of the forum, offering an email address for people to contact the agency. ... Indeed, Genesis demonstrated the "growing professionalization and specialization of the cybercrime sphere," with the site earning money by gaining and maintaining access to victim systems until administrators could sell that access to other criminals, according to Sophos. The various tasks that the Genesis Market bots could undertake included large-scale infection of consumer devices to steal digital fingerprints, cookies, saved logins, and autofill-form data stored on them. The marketplace would package up that data and list it for sale, with prices ranging from less than $1 to $370, depending on the amount of embedded data that the packages contained.


Beyond Hype: How Quantum Computing Will Change Enterprise IT

“If you have a problem that can be put into an algorithm that leverages the parallelism of quantum computers, that’s where you can get a very dramatic potential speed up,” Lucero says. “If you have a problem where every additional variable you add doubles the computational complexity -- that is probably a good candidate to be adapted into a quantum computational problem.” The so-called “traveling salesperson problem,” for example, would be a fitting problem for a quantum computer. The algorithm asks the following: “Given a list of cities and the distances between each pair of cities, what is the shortest possible route that visits each city exactly once and returns to the origin city?” This and other combinatorial optimization problems are important to theoretical computer science because of the complexity of variations involved. Used as a benchmark, the algorithm can be applied to planning, logistics, microchip manufacturing and even DNA sequencing. In theory, a quantum computer could make quick work of this complicated algorithm and provide greater efficiency for programming.
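To see why classical machines struggle with this problem, consider a brute-force solver: the number of candidate tours grows factorially with the number of cities, which is exactly the kind of blow-up quantum approaches hope to sidestep. A minimal Python sketch, using a small hypothetical distance matrix (the four-city data is invented for illustration):

```python
from itertools import permutations

def tsp_brute_force(dist):
    """Exhaustively check every tour; work grows factorially with city count."""
    cities = list(range(len(dist)))
    best_route, best_len = None, float("inf")
    # Fix city 0 as the start so rotations of the same tour aren't re-counted.
    for perm in permutations(cities[1:]):
        route = (0,) + perm + (0,)
        length = sum(dist[a][b] for a, b in zip(route, route[1:]))
        if length < best_len:
            best_route, best_len = route, length
    return best_route, best_len

# Hypothetical symmetric distance matrix for 4 cities.
dist = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 3],
    [10, 4, 3, 0],
]
print(tsp_brute_force(dist))  # -> ((0, 1, 3, 2, 0), 18)
```

Four cities mean only six tours, but 20 cities already mean roughly 10^17 -- which is why the problem is used as a benchmark for alternative computing models.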


How to build next-gen applications with serverless infrastructure

When explaining the benefits of serverless infrastructure and containers, I'm often asked why you need containers at all. Don't instances already provide isolation from underlying hardware? Yes, but containers provide other important benefits. Containers allow users to fully utilize virtual machine (VM) resources by hosting multiple applications (on distinct ports) on the same instance. As a result, engineering teams get portable runtime application environments with platform-independent capabilities. This allows engineers to build an application once and then deploy it anywhere, regardless of the underlying operating system. ... Implementing event-driven architecture (EDA) can work for serverless infrastructure through either a publisher/subscriber (pub/sub) model or an event-stream model. With the pub/sub model, notifications go out to all subscribers when events are published. Each subscriber can respond according to whatever data processing requirements are in place. On the event-stream model side, engineers set up consumers to read from a continuous flow of events from a producer. 
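The two event-driven models described above can be sketched side by side. This is a minimal in-memory illustration, not any particular cloud service's API: in pub/sub, every subscriber is pushed each published event; in an event stream, events accumulate in a log and each consumer pulls from its own offset.

```python
class EventBus:
    """Pub/sub model: every published event is pushed to all current subscribers."""
    def __init__(self):
        self._handlers = []

    def subscribe(self, handler):
        self._handlers.append(handler)

    def publish(self, event):
        for handler in self._handlers:
            handler(event)


class EventLog:
    """Event-stream model: producers append; each consumer reads from its own offset."""
    def __init__(self):
        self._events = []

    def append(self, event):
        self._events.append(event)

    def read_from(self, offset):
        return self._events[offset:]


# Pub/sub: both subscribers react to the same published event.
bus = EventBus()
seen = []
bus.subscribe(lambda e: seen.append(("billing", e)))
bus.subscribe(lambda e: seen.append(("shipping", e)))
bus.publish({"order_id": 42})

# Event stream: a consumer that has already processed 1 event reads only what's new.
log = EventLog()
log.append({"order_id": 1})
log.append({"order_id": 2})
new_events = log.read_from(1)
print(seen, new_events)
```

The design difference matters for serverless: pub/sub fans out work immediately to whoever is listening, while a stream lets consumers scale and replay independently because the log, not the consumer, owns the history.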



Quote for the day:

"When I finally got a management position, I found out how hard it is to lead and manage people." -- Guy Kawasaki

Daily Tech Digest - April 05, 2023

The evolving role of the CHRO in the post-pandemic era

A major landscape shift for CHROs is the advent of HR technologies that can be used as hiring as well as learning and development tools. By collecting and analysing critical HR data, companies can derive actionable insights to improve workforce, people, and talent management performance. Case in point, TDCX’s proprietary Flash Hire recruitment platform enables hiring managers to gauge a candidate’s competency and alignment with the job requirements more quickly through its artificial intelligence (AI) capabilities. ... Actionable insights can be combined with digital tools to boost employee productivity and encourage workers to further their career trajectories through learning and coaching for better outcomes. Additionally, digital tools can help organisations reach out and engage talent globally. While companies innovate their HR strategies by leveraging data analytics and digital tools, such efforts should also be balanced with a human touch. Many companies are already using technology such as robotic process automation and AI to improve hiring, boost employee productivity and encourage learning and career growth, and such instances will only continue to increase.


Three Trends in Enterprise Architecture to Optimize IT Operations

A shift to Continuous Delivery sees each team organize itself around an end-to-end value stream that brings ongoing and longer-term benefits to the business rather than short-term and time-bounded “wins” for individual departments. In addition, a Continuous Delivery approach aims to change the delivery model so that teams can work independently and with less direction rather than competing with one another for resources. ... In this model, each team holds responsibility for the design, operation, realization and future-state direction of their own semi-independent business area. Empowering teams in this way causes a shift in power control, making decision-making more democratic and less centralized. In addition, optimizing collaboration between teams brings about more active conversation and facilitation, resulting in a freer flow of information across the business. In shifting the Enterprise Architect role to a facilitator role rather than that of a data warden, the democratization of Enterprise Architecture significantly impacts the enterprise, evolving how it operates and the type of tooling it needs for success.


IT Leaders as Advocates for Continual Change

“All IT leaders should keep the concept of constant change top of mind, as it is essential for driving innovation, staying ahead of the curve, and responding quickly to new challenges and opportunities,” he says. Dennis Monner, chief commercial officer at Aryaka, explains businesses are coming to realize that digital transformation is a never-ending process. “As environments change and are on the verge of more change, IT leaders want to be well-prepared and be able to adapt,” he says. This means they need to be continuously evaluating their businesses, as well as different models of operation that may facilitate transformation more smoothly. “IT leaders must recognize when a particular approach may not be working as expected and when a pivot may be necessary,” he says. “Keeping an open mind to continual change means a business is putting the needs of its customers first, rather than forcing customers to adapt and settle for less than they deserve.” If a product or service isn’t resonating with customers, business leaders need to be willing to shake things up, as opposed to continuing down a path that will not yield results.


Top Digital Communication Skills You’ll Need For 2023

Cultural and social sensitivity is a significant part of digital communication management. Understanding different cultural and social dynamics enables you to become a better business leader, and will help you to develop relevant communication strategies. This is why at BI students receive a grounding in the psychological and sociological concepts underlying the communication process. Modules such as ‘human interaction’ and ‘human-technology interaction’ educate students on different theories related to identity and diversity, along with an extensive understanding of how people interact and communicate with each other through technology. ... To excel in a career in management, it is vital that you have a clear understanding of the dynamics of communication. At BI, you will be introduced to solid theories and research which revolve around digital communication strategy, social media, network theory, and innovation and corporate storytelling. Our methods course covers the basics of qualitative and quantitative methods such as data handling, recoding, experimental statistics, and digital and social network data analysis.


How Unregulated Access and Advanced Processing Threaten Personal Privacy

Powerful tech companies like Google, Amazon, Meta, and Apple revolutionized data analytics and its use in the private sector. By combining widely collected user activity with behavioral psychology, commercial organizations alter choices that people make, often in the interest of profit maximization. While not illegal, such manipulation challenges the notions of individual choice in a liberal society. However, while processing capabilities have matured, the requirements to secure data during the collection, storage, and analysis phases grow increasingly complex. Companies face endless attempts to exploit networks and servers, keeping cyber security teams busy with ever-evolving threats. Preventing data breaches, such as the 2013-2015 exfiltration of personal information from the United States Office of Personnel Management (OPM), requires skilled technicians and powerful software working around the clock. Perhaps more concerningly, the proliferation of commercially-collected data has led to the rise of data brokers, who profit off the collection and sale of personal information within the current legal limitations of consumer data usage.


Should Every Incident Get a Retrospective?

Try to do this for every incident, and you’ll quickly run into practical limitations. Each incident can take tens of person-hours to study, and meanwhile, more incidents may stack up. At some point, there won’t be enough hours in the day, enough incident analysts to run investigations, or enough money to spend. You’ll also soon exhaust your organization’s goodwill. Earlier in my career, I learned this lesson the hard way. My site reliability engineering (SRE) team had a keen interest in getting a handle on the growing number of production incidents in our cloud infrastructure. ... High-profile incidents as a whole should give you pause. These are exactly the kinds of incidents that we can learn the most from. However, tensions will run high, and even a company that normally embraces a blameless approach to retrospectives might look for someone to take the fall. Does your incident retrospective have the potential to cause harm to incident participants, or perhaps even termination? It might be best to focus on incidents that are less emotionally charged, especially at companies that are still early on the path toward blameless retrospectives.


UK data regulator issues warning over generative AI data protection concerns

“Organisations developing or using generative AI should be considering their data protection obligations from the outset, taking a data protection by design and by default approach,” said Stephen Almond, the ICO’s director of technology and innovation, in the post. Almond also said that, for organizations processing personal data for the purpose of developing generative AI, there are various questions they should ask themselves, centering on: what their lawful basis for processing personal data is; how they can mitigate security risks; and how they will respond to individual rights requests. “There really can be no excuse for getting the privacy implications of generative AI wrong,” Almond said, adding that ChatGPT itself recently told him that “generative AI, like any other technology, has the potential to pose risks to data privacy if not used responsibly.” “We’ll be working hard to make sure that organisations get it right,” Almond said. The ICO and the Italian data regulator are not the only ones to have recently raised concerns about the potential risk to the public that could be caused by generative AI.


The Data Trifecta: Privacy, Security, and Governance from Reactivity to Resilience

The panelists broadly agreed that bringing together privacy, security, and governance teams to develop a tech-driven model of unified engagement can revolutionize a business’s efficiency and effectiveness. In Sommer’s observations, this form of platform approach cuts down on the number of individual interactions between teams, while at the same time distilling the load of regulatory obligations, all of which makes operations more manageable to support. Streamlining this process allows managers to automate rules and controls into the data in a manner that lets the data flow, “which ultimately means that our privacy teams and security teams can say yes to more projects,” noted Prestidge. “Then our data teams can get access to that approved data much quicker, speeding up all of the processes and reducing the risk.” ... “At the end of the day, the intentions behind most privacy regulations tend to be focused around protecting the consumer,” reflected Sommer. It’s therefore essential for data teams to implement a degree of consistency between promises made to consumers in initial privacy disclosures and the ways in which data is used – and shared – in current operations.


Designing Tabletop Exercises That Actually Thwart Attacks

"The big thing that we want to shoot for in these tabletops is as much realism as we can possibly get," Durrin says. But realism is difficult to simulate. Davidoff jokes about how "we tried using ChatGPT to run a tabletop exercise," and it didn't turn out so well. "It's like: 'I am the facilitator,' and starts walking you through the steps. But it's very boring. It doesn't give you any curveballs." Simulating realism, ironically, requires a good deal of showmanship: storytelling, audio and visual materials, and a certain creativity to generate the chaos and unpredictability you'd find in a cyberattack in real life. But little of this theater is completely made-up. "We try to leverage the experience that we've gained over the years of actually dealing with these attacks in the wild," Durrin notes, ... Enterprises that run these kinds of simulations tend to avoid those mistakes. "We've actually been able to see how those changes that we've made and tested inside of an incident response plan have benefited organizations in a very tangible and real sense," Durrin says, "in the speed of recovery, the quality of recovery and how the organization is actually able to get back on their feet after suffering from an incident."


Attention to detail matters

What about O’Brien’s theory that caring about details is something that you’re born with? I think people can learn to apply more attention to detail. If you give people enough context about why they should care about the little things, they should make the effort to internalize the lesson. That must be what happens in work environments in which people’s lives are on the line, such as operating rooms and airplane cockpits. But what about in the office? In the many years I spent as an editor managing teams of reporters, I learned it was a good practice, when I moved into a role, to tell my new colleagues that I didn’t like corrections—the notes reporters add at the end of an online article that explain how an earlier version got some spelling or fact wrong. I didn’t say it in a threatening way. I just shared it as a fact about me. That approach sent a clear signal that I expected people to check their facts before their piece was published. So, on the nature-or-nurture question, the answer is probably a bit of both—but the advice would be to cultivate a detail-oriented approach if you sense you might not be born with one.



Quote for the day:

"Listening & trusting the inner voice is one of the most important lessons of leadership." -- Warren Bennis

Daily Tech Digest - April 03, 2023

From CDO to CTIO – what tech leader job titles really mean, and who calls the shots

Like CDOs, Chief Technology Officers are very much on the rise. Particularly in technology companies themselves and in start-ups, you are likely to find a CTO at the head of the ship rather than a CIO, and this is also the case in many legacy businesses who are aiming to transform themselves into technology-led organisations: the emphasis on technology in the title better reflects the wider brief of today’s digital leader. So, the CTO role is very much ‘of our times’. There again, the CTO could also be a number two to the CIO who leads all of technology, architecture and/or software engineering. ... Hybrid roles have been very much on the rise recently, especially at the top end. Instead of having ‘just’ a CIO or CDO or CTO, Chief Technology & Information Officers are becoming more common. It’s a role that combines accountability for IT with technology/product delivery. The CTIO will act as an internal advisor to the Board on information strategy, infrastructure and systems as well as being the leader in new product development with an eye to the future and a horizon-scanning mindset.


CIOs must evolve to stave off existential threat to their role

Already under pressure to accelerate digital transformation, CIOs now often find their voices drowned out by LOB executives who are heavily involved in making technology decisions, according to the report. This trend could leave CIOs vulnerable to decreased influence over the corporate technical agenda, or pushed into a secondary C-suite role. ... To ward off LOB heads from their turf, Linus Lai, chief analyst and digital business research lead at IDC A/NZ, says CIOs must be able to demonstrate to other members of the C-suite how their actions and decisions directly boost the bottom and top lines. CIOs should also build stakeholder relationships within LOBs and leverage business relationship managers to better serve customer-facing organizations. “CIOs will have to ensure effective joint business outcomes from IT and LOBs by delivering strategic digital business advice and enabling effective upwards communication. They must initiate a critical review of sourcing practices to manage the supplier ecosystem to maintain architectural goals and spending targets ...


These Experts Are Racing To Protect AI From Hackers

Concerns about attacks on AI are far from new but there is now a growing understanding of how deep-learning algorithms can be tricked by making slight -- but imperceptible -- changes, leading to a misclassification of what the algorithm is examining. ... "Data poisoning can be one of the most powerful threats and something that we should care a lot more about. At present, it doesn't require a sophisticated adversary to pull it off. If you can poison these models, and then they're used widely downstream, you multiply the impact -- and poisoning is very hard to detect and deal with once it's in the model," says Slater. If that algorithm is being trained in a closed environment, it should -- in theory -- be reasonably well protected from poisoning unless hackers can break in. But a bigger problem emerges when an AI is being trained on a dataset that is being drawn from the public domain, especially if people know this is the case. Because there are people out there -- either through a desire to cause damage, or just to cause trouble -- who will try to poison the algorithm.
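The mechanics of data poisoning can be shown with a deliberately tiny model. The sketch below uses a nearest-centroid classifier on invented 2-D data (the points, labels, and probe are all hypothetical, and real poisoning attacks target far larger training pipelines): an attacker who can inject mislabeled points into a publicly sourced training set drags one class's centroid into the other's territory, flipping predictions downstream.

```python
def centroid(points):
    """Mean point of a list of 2-D points."""
    n = len(points)
    return (sum(p[0] for p in points) / n, sum(p[1] for p in points) / n)

def classify(x, centroids):
    """Nearest-centroid classifier: return the label of the closest centroid."""
    return min(centroids, key=lambda label: (x[0] - centroids[label][0]) ** 2
                                          + (x[1] - centroids[label][1]) ** 2)

# Hypothetical training data: class "a" clusters near (0,0), class "b" near (10,10).
clean = {"a": [(0, 0), (1, 0), (0, 1)], "b": [(10, 10), (9, 10), (10, 9)]}
clean_centroids = {k: centroid(v) for k, v in clean.items()}

# Poisoning: the attacker injects points deep in "a" territory but labeled "b",
# dragging b's centroid toward a's region of the input space.
poisoned = {"a": clean["a"], "b": clean["b"] + [(0, 0)] * 6}
poisoned_centroids = {k: centroid(v) for k, v in poisoned.items()}

probe = (2, 2)
print(classify(probe, clean_centroids))     # "a" on the clean model
print(classify(probe, poisoned_centroids))  # flips to "b" after poisoning
```

Each individual poisoned point looks unremarkable on its own, which is why, as Slater notes, poisoning is hard to detect once it is in the model.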


5 strategies to manage cybersecurity risks in mergers and acquisitions

In tech deals where technology is the target’s product or an important part of it, cybersecurity is a particular focus, said Philip Odence, general manager of Black Duck Audit Business at Synopsys, who specializes in due diligence in M&A transactions. As such, the acquiring company must determine if the target company has designed security into its software. If not, the acquiring company is buying a bunch of unplanned future remediation work, he says. “As excessive problems will mean a heightened chance of getting breached, the buyer might want some portion of funds to be escrowed against such an eventuality,” Odence says. “It’s also not highly unusual for valuation to be negotiated if software is significantly not up to industry norms.” Buyers don’t expect perfection, but if there are more than an expected number of issues to address, the buyer’s perspective on the deal might change, Odence says. It’s rare for due diligence discoveries to kill a deal, but they could impact deal terms, timing, or valuation.


The Anatomy of a Comprehensive Penetration Test

The goal of a penetration test should be to deliver a blueprint for achieving an improved security posture so these organizations can be set up for success. This means including best practices for fixing any issues where specific implementation details are not known by the pentester. It doesn’t stop with just a list of diagnoses for vulnerabilities. A complete inventory of all assets should also be included, with detail on the asset type, IP address, and geolocation information. This will provide visibility into how large an organization’s attack surface is and allow teams to understand which issues should take priority when multiple are found. No asset or resource should be considered “out of scope” when conducting a penetration test. This includes not only the web application itself, but also any external resources that it relies on, including API servers and third-party integrations. Developers may claim that since they didn’t create those resources they shouldn’t be on the hook to secure them, but the organization still needs to be accountable because it is using them. 


The Art Of Letting Go: How Data Minimization Can Improve Cybersecurity And Reduce Cost

One of the biggest challenges organizations face when it comes to implementing data minimization is determining what data is necessary to keep and what can (or should) be disposed of. With the vast amount of data generated and collected every day, it can be overwhelming to know what data you have in the first place, what’s important (or critical or sensitive or regulated) and what data can—or should—be discarded. By reducing the amount of data stored, organizations can decrease their risk of data breaches and improve regulatory compliance. Data minimization can also streamline data management processes, leading to increased efficiency and cost savings. So, how does one begin the process of data minimization? It all starts with knowing your data. Organizations need to have a clear understanding of what data they are collecting, how sensitive it is and how it is being used. This can help identify unnecessary data—often called redundant, obsolete or trivial (ROT) data—that can be safely disposed of.
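One simple, concrete starting point for finding ROT data is a scan for files nobody has touched within a retention window. The sketch below is a minimal illustration, not a compliance tool; the three-year threshold is a hypothetical policy choice, and real programs also need to weigh sensitivity and regulatory holds, not just age.

```python
import os
import tempfile
import time

ROT_AGE_DAYS = 3 * 365  # hypothetical retention threshold: untouched for ~3 years

def find_rot(root, now=None):
    """Walk a directory tree and flag files not modified within the threshold."""
    now = time.time() if now is None else now
    cutoff = now - ROT_AGE_DAYS * 86400
    stale = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if os.path.getmtime(path) < cutoff:
                stale.append(path)
    return stale

# Demo: one freshly written file, one backdated to the epoch.
workdir = tempfile.mkdtemp()
fresh = os.path.join(workdir, "fresh.txt")
old = os.path.join(workdir, "old.txt")
for p in (fresh, old):
    open(p, "w").close()
os.utime(old, (0, 0))  # backdate the modification time
print(find_rot(workdir))  # only old.txt is flagged
```

A report like this turns "we should minimize data" into a concrete, reviewable disposal candidate list, which is where the cost and breach-surface reductions actually come from.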


Five steps to champion a data product strategy

Treating data like a product gives more structure to the ownership, processes, and technology needed to provide the organisation with access to clean, curated, continuously-updated data. So, the data product becomes a consumption-ready set of high-quality, trustworthy, accessible data that can be applied to solve genuine business challenges. In short, it’s the best version of data available to service a defined purpose and achieve a desired outcome for the business. ... The first step once your strategy is signed off is to develop a minimum viable data product (MVDP). Start small so you can release quickly, before iterating and delivering further capabilities. Each release of your data product should offer a little more value. This will help drive adoption, as well as showing returns which will help you secure any additional funding or resources required. Success will of course also depend on your LOB partners understanding how to use the data product as part of their existing working processes. It is rare that adding a new process will be widely and successfully adopted. 


Preventing artificial deception in the age of AI

Managing the concerns without stifling the potential of AI is the key challenge facing regulators across the world. The US has chosen a hands-off approach, encouraging private sector investment and prioritising AI research and development. China has opted for a centralised system focused on economic development and societal governance. The EU has focused more on regulation emphasising transparency, accountability, and protection of human rights. This includes proposed new regulations to establish standards for AI development and deployment, including strict rules for high-risk AI applications and biometric data usage, aiming to build trust in AI through transparency and accountability while ensuring safety and ethical considerations. The UK has adopted what it is calling a pro-innovation approach by enabling current regulators to determine how best to adapt existing regulation to the deluge of AI development and progress using a set of common principles. Whichever approach is adopted, a new regulatory mindset will be required to keep up with the pace of change.


12 ways IT leaders can build business buy-in

Modern CIOs know to speak in business terms and leave the tech jargon behind. But those who are truly intertwined with their business unit colleagues are speaking not only about strategy but key components of it: growth, revenue, profit margin, and so on. As Kande explains, “The business is asking for technology to deliver business outcomes: Are we selling more products and services? Do we have [for example] more visibility into manufacturing or supplies?” ... Another approach Juliano uses to ensure IT and business are in lockstep as they advance organizational objectives is to identify and highlight shared goals. For him, that means in part articulating IT’s piece of initiatives as well as demonstrating IT’s commitment to co-owning success — and, if things don’t go right, co-owning failure, too. “Your IT deliverables should be 100% part of the business’ strategic goals,” he says. “But if you’re making plans and you’re not seeing that there’s a clear IT objective, then you’re reducing your chance of successes and I’d question why you’re not part of that execution. So get your name on those goals so you are seen as a co-deliverer. Make sure your name is primary or secondary owner.”


Digital transformation: How to teach the language of change

While the CIO or CTO is often the first ambassador for a digital transformation, they need close collaboration from their peers to be successful. Those who oversee the processes and the people in the organization must work closely with the CEO to transform all three legs to keep the stool upright. That means the entire C-Suite – chiefs of operations, HR, finance, marketing, communications, and others – must be able to speak the language of digital transformation fluently. It will take some work. The CEO, along with the CTO/CIO, will need to teach their peers what digital transformation is all about and how to make it happen. Then they all need to share a common vision, a shared commitment, and a deep sense of accountability for the success of the digital transformation. ... Anyone who has undergone a digital transformation knows that it is one of the most significant undertakings an organization can face. It reaches into every corner of the business, from operations to customer satisfaction to employee culture. And it sets the tone for the next transformation, whenever that may be.



Quote for the day:

"Not all readers are leaders, but all leaders are readers." -- Harry S. Truman

Daily Tech Digest - April 02, 2023

4 ways to make monolithic enterprise apps feel cloud-native

Fortunately, all hope is not lost if you have an enterprise app you can't containerize or "microservices-ize." Even if you can't "break down your monolith"—to quote the standard advice about how to modernize legacy apps—there are many practical steps you can take to make legacy, monolithic applications look and feel (in most respects) like modern, cloud-native apps. ... Most enterprise apps are compatible with the cloud in the sense that you can host them on cloud-based virtual machine (VM) instances using services from different cloud providers. By taking advantage of cloud hosting for enterprise apps, organizations can also leverage cloud automation tools, such as autoscaling policies and rules that automatically start or stop VMs. Deploying enterprise apps in a highly scalable, automated manner becomes possible. ... Although configuring load balancers for enterprise apps hosted in the cloud is not a strict requirement, doing so can substantially improve application performance and reliability. Load balancers help direct requests at application instances best able to handle them. 
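The load-balancing idea above can be illustrated with the simplest possible policy, round-robin rotation across identical monolith instances. This is a toy in-process sketch with invented instance names; production balancers (cloud or otherwise) add health checks, weighting, and connection draining on top of this core loop.

```python
from itertools import cycle

class RoundRobinBalancer:
    """Distribute incoming requests evenly across a pool of app instances."""
    def __init__(self, instances):
        self._pool = cycle(instances)

    def route(self, request):
        # Pick the next instance in rotation; real balancers also health-check
        # targets and skip instances that fail probes.
        return next(self._pool), request

lb = RoundRobinBalancer(["vm-1", "vm-2", "vm-3"])
routed = [lb.route(f"req-{i}")[0] for i in range(6)]
print(routed)  # -> ['vm-1', 'vm-2', 'vm-3', 'vm-1', 'vm-2', 'vm-3']
```

Even an unmodified monolith benefits: run several identical VM instances behind a balancer like this and you get horizontal capacity and failover without touching the application code.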


What is Event Sourcing Design Pattern in Microservice Architecture? How does it work?

Event sourcing is a design pattern that stores the state of an application as a sequence of events in the order that they occurred, rather than simply storing the current state of the application. In microservice architecture, this pattern can help in achieving better scalability, fault tolerance, and resilience. Here’s how it works: An event is generated whenever a change occurs in the system. The event is persisted to an event store, which is essentially a log of all events that have occurred in the system. The current state of the system can be reconstructed at any time by replaying all of the events in the event store, in the order that they occurred. Each service in the microservice architecture can have its own event store, which can be used to maintain its own state. Services can subscribe to events that are relevant to them and update their own state accordingly. By using event sourcing, it is possible to achieve better fault tolerance and resilience. 
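The append-then-replay mechanics described above fit in a few lines. This is a minimal in-memory sketch with an invented bank-account example, not a production event store (which would add persistence, versioning, and snapshots):

```python
class EventStore:
    """Append-only log; current state is derived by replaying events in order."""
    def __init__(self):
        self.events = []

    def append(self, event):
        self.events.append(event)

    def replay(self, apply, initial):
        # Fold every recorded event, oldest first, into the current state.
        state = initial
        for event in self.events:
            state = apply(state, event)
        return state

def apply_account_event(balance, event):
    """Hypothetical reducer: deposits add to the balance, withdrawals subtract."""
    kind, amount = event
    return balance + amount if kind == "deposit" else balance - amount

store = EventStore()
store.append(("deposit", 100))
store.append(("withdraw", 30))
store.append(("deposit", 5))
print(store.replay(apply_account_event, 0))  # -> 75
```

Because the log is the source of truth, the same events can be replayed after a crash, into a new read model, or by another subscribing service -- which is where the fault-tolerance and resilience benefits come from.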
 

Why Cloud Data Replication Matters

There are many reasons for building applications that understand replication, with or without cloud support. These are basic topics that any developer has had to deal with, but they are even more important when applications go global and/or mobile. Then they need ways to keep data secure and located efficiently. ... As you can see, data replication does not explicitly depend on using cloud resources. Enterprises have been using their internal networks for decades with some of the same benefits. But with the addition of cloud-based resources, the opportunity to have extremely high availability and performance is easier than ever. Traditional data replication has now been extended beyond just replicating from a PC to a network or between two servers. Instead, applications can replicate to a global network of endpoints that serve multiple purposes. Traditionally, replication was used to preserve data in case of a failure. For example, replicas could be copied to a node if there was a failure, but replicas could not be used directly by an application.
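The idea of replicating to a network of endpoints can be sketched with a toy synchronous fan-out store. The region names and data below are invented for illustration; real systems must also handle partial failures, conflicts, and asynchronous lag between replicas.

```python
class ReplicatedStore:
    """Sketch: fan each write out to every replica endpoint synchronously."""
    def __init__(self, replica_names):
        self.replicas = {name: {} for name in replica_names}

    def write(self, key, value):
        # In this naive model every replica applies the write before we return.
        for store in self.replicas.values():
            store[key] = value

    def read(self, key, replica):
        # Any replica can serve reads; after a write, all hold the same data.
        return self.replicas[replica].get(key)

db = ReplicatedStore(["us-east", "eu-west", "ap-south"])
db.write("user:1", {"name": "Ada"})
print(db.read("user:1", "ap-south"))  # -> {'name': 'Ada'}
```

This captures the shift the article describes: replicas are no longer just a recovery copy sitting idle until a failure, but live endpoints that applications read from directly for availability and locality.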


Composable Architecture - Part 1

Composable architecture refers to an architectural pattern that allows for the creation of systems made up of small, independent, and interchangeable components. These components can be developed and deployed independently and can be combined together to create new applications or features. In this way, composable architecture is similar to the concept of microservices, where a system is broken down into small, self-contained units that can be managed and developed independently. Today, using Module Federation, it is possible to go beyond microservices and enable independent product squads to build robust micro applications that can be composed into a seamless functionality for both retail and corporate banking clients. ... By using micro frontends, banks can break down their front-end application into smaller, independent components that can be developed and deployed independently. This allows product squads to work independently and develop micro apps that can be composed together on the front end to make it seamless for the banking customer. 
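The composition idea itself is language-agnostic. The article's Module Federation example is a JavaScript/webpack mechanism; the Python sketch below (with invented component names) only illustrates the underlying pattern: independently owned components expose a common interface, and a thin shell composes them without knowing their internals.

```python
# Hypothetical micro apps: each function is owned and shipped by a separate squad.
def account_summary(user):
    return f"[account] balance for {user}"

def recent_payments(user):
    return f"[payments] last 3 payments for {user}"

def compose_page(user, components):
    """The shell only composes; it knows nothing about component internals."""
    return "\n".join(render(user) for render in components)

page = compose_page("ada", [account_summary, recent_payments])
print(page)
```

Because the shell depends only on the shared interface (a render callable here, a federated module in the web case), any squad can revise or redeploy its component without coordinating a release of the whole application.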


How to Become an Enterprise Architect

An enterprise architect must effectively serve as a central nervous system of the business, making sure its various organs operate in a greater harmony – even when these local channels are out of communication with each other. As you might expect, this almost always comes down to a continuous stream of meetings that scale anywhere from big-picture issues like roadmap orientation and long-term strategies to more nuts-and-bolts implementation and putting out fires. ... While the broad skill set and think-outside-the-box mindset entailed in the job allows for some degree of flexibility, recruiters for potential enterprise architects typically look for an undergraduate degree in computer science or related discipline. Enterprise architects with a master’s degree will have even more solid prospects. Job candidates are expected to have a minimum of five years in the IT field – preferably a decade or more. ... As enterprise architects continue to become more sought after, an aspiring professional will benefit from a resume with certification in skills that custom-fit the position in question.


Tracking the Evolving Cybersecurity Jobs Landscape for 2023

For someone looking to start out in cybersecurity, a direct-entry role in security is rare, especially as it bypasses the potential growth and opportunities to learn from other positions. Combs recommends finding a path through systems administration or another foundational area that provides contextual information to build on. “You can’t protect an entity if you don’t understand the systems that comprise the entity,” he says. Even as professionals become more experienced in cybersecurity, they must remain committed to their specialization with or without constant acknowledgment of their success. ... “On your best day, nothing happened. And then on your worst day, one shift where something happened, everybody’s like, ‘Why did you let this happen?’ But you’re like, ‘Nothing happened all those other days!’” With more open-source learning and professional development resources available than ever, Combs encourages up-and-coming professionals to take advantage of all of them, but warns against bootcamps that make assertions of guaranteed employment.


Cybersecurity experts argue that pausing GPT-4 development is pointless

Despite concerns about the societal risks posed by generative AI, many cybersecurity experts are doubtful that a pause in AI development would help at all. Instead, they argue that such a pause would provide only a temporary reprieve for security teams to develop their defenses and prepare to respond to an increase in social engineering, phishing and malicious code generation. ... “We need a new class of AI trust, risk and security management [TRiSM] tools that manage data and process flows between users and companies hosting LLM foundation models. These would be CASB-like [cloud access security broker] in their technical configurations but, unlike CASB functions, they would be trained on mitigating the risks and increasing the trust in using cloud-based foundation AI models,” Litan said. As part of an AI TRiSM architecture, users should expect the vendors hosting or providing these models to supply tools to detect data and content anomalies, alongside additional data protection and privacy assurance capabilities, such as masking.


Overcoming obstacles to introduce zero-trust security in established systems

Even small and midsize enterprises have undergone significant changes in recent years. To become more flexible, they embraced trends like remote work, BYOD, and decentralized IT, and moved their operations to the cloud. It is no longer exceptional for a business to have no private network, yet it still has systems, apps, data and employees that need protection from ever-growing online threats. When balancing the security of this new reality with the needs of a remote workforce, it is not only about the technology, which should be easy to use, reliable and unobtrusive to the user, but also about employee education. Employees should be trained in new policies so that they don’t resist the new security procedures but actively contribute to them. For example, multifactor authentication should not be seen as a nuisance but as a standard part of logging in and second nature to all employees. Regular phishing drills should be conducted to train workers in spotting and reporting phishing attempts.


Tech Titans: The crossroads of humanity and technology

“You cannot escape technology – tech is intertwined no matter what we do; it has become a utility like water, heat, and electricity. Not having access to technology can be detrimental to having access to the essentials of daily life. We need to work together to ensure that everyone – especially the underserved and disenfranchised communities – has access to technology.” ... But technology by itself is not the only deterrent, Hannah noted. Even if technology is available, the cost could be a challenge in such communities – especially for those who struggle to provide necessities like food. Literacy in these communities is another key challenge. Pegues characterizes digital redlining as a socio-economic problem at the crossroads of humanity and technology, which triggered a question about technology’s ability to combat digital redlining. While acknowledging that “old habits are hard to break,” Pegues also called out the possibility of detecting and addressing the algorithms that advance this practice.


Why AI Could be Critical in Reducing the Court Backlog

Today, nearly every criminal investigation includes a substantial digital evidence component, whether that be texts, WhatsApp messages, emails, internet browser histories, or data from GPS devices and fitness trackers. Research conducted by the University of Exeter in conjunction with the UK police force found that digital forensics is now used in over 90% of criminal cases. But our ever-growing digital footprint presents a significant challenge for criminal defence lawyers, who now need to review unprecedented volumes of data ahead of trial. Legal teams that fail to thoroughly analyse all this information risk not meeting evidentiary thresholds, as in the case of Liam Allan, where 40,000 text messages were omitted from evidence and caused the case to fall apart at the last minute. The pandemic may have prompted the UK courts to revise traditional methods of working and embrace new technology, such as video conferencing software for remote trials, but there is still much work to be done. 



Quote for the day:

"A leader or a man of action in a crisis almost always acts subconsciously and then thinks of the reasons for his action." -- Jawaharlal Nehru