Daily Tech Digest - October 03, 2022

Roadmap to RPA Implementation: Thinking Long Term

Ted Kummert, executive vice president of products and engineering at UiPath, says RPA should be viewed as a long-range capability meant to empower organizations to evolve strategically and increase business value. It is a journey that can start small, within one division or one department, and grow organically across the business as additional ideas form and the organization’s vision for automation’s potential comes to fruition. He says RPA can clear backlogs, create new capacity, free up resources, and improve data quality by integrating software robots into workflows. “It is a truly transformative technology that can reduce or eliminate manual tasks and elevate creative, high-value work,” Kummert says. “Digital transformation is often talked about, but many times can fall short of its goals. Automation is the driver to achieve true digital transformation.” Adam Glaser, senior vice president of engineering for Appian, says many businesses use one automation technology, adding third-party capabilities in patchwork fashion to automate complex end-to-end processes.


How to start and grow a cybersecurity consultancy

To be successful, an entrepreneur must be resilient. Any comment that runs along the lines of “That’s not possible,” or “That can’t be done” should be treated as a challenge to prove the speaker wrong. An entrepreneur needs to have the ability to see past what’s not important. Entrepreneurs don’t just need money – they also need support in the form of encouragement and advice. I would advise budding entrepreneurs to attend meetups within their industry or local community and seek out online support via forums and groups. You’ll be surprised just how willing others will be to help and offer advice for free. Asking questions, getting reassurance and sanity checks from peers can be invaluable at all stages of your business’s journey. There will always be someone a little further down the path you’re taking. Starting a business can be exhilarating, rewarding and fun, but can be exhausting, relentless and stressful in equal measure.


Surveillance tech firms complicit in MENA human rights abuses

“When operating in conflict-affected or high-risk regions such as the MENA region, the surveillance sector must undertake heightened human rights due diligence and, if it cannot do so or it identifies evidence of harm, it should stop selling its technology to companies or governments,” said Dima Samaro, MENA regional researcher and representative at the Business & Human Rights Resource Centre. “Lack of adequate due diligence measures by private companies will only worsen the situation for those from marginalised communities, putting their lives in jeopardy as the absence of robust regulation and effective mechanisms in the region allows surveillance technologies to be operated freely and without scrutiny.” The report added that, although the United Nations’ (UN) Guiding Principles on Business and Human Rights were adopted a decade ago – which establish that companies must take proactive and ongoing steps to identify and respond to the potential or actual human rights impacts of their business – the principles’ non-binding, voluntary nature means there are “glaring gaps in human rights safeguards” at the firms.


How companies can accelerate transformation

Ensuring that customer value drives technology architecture and investment is one way to optimize technology usage. Another way is to ensure that an organization is getting the most out of the investments it has already made. Inefficiency in any aspect of technology usage represents a drag on businesses’ ability to change quickly. ... While enterprise architects (EAs) play a central role in identifying opportunities for this type of technology optimization, they have an even greater role to play when it comes to optimizing the entire IT landscape. A “business capability” perspective makes this possible. ... Efficiency doesn’t improve on its own. The business needs to decide to improve it. Making those decisions, however, is not always easy. As mentioned, relying on business capabilities to evaluate technology needs is one way to simplify the decision process. The other is visibility. Business leaders can’t make decisions if they can’t see the problem. In terms of business architecture, EAs help guide leaders in the decisions they make by showing them business capability maps, data-rich process diagrams and dashboards highlighting the connection between architectural issues and business value.


Optus reveals extent of data breach, but stays mum on how it happened

Optus says its recent data breach impacted 1.2 million customers with at least one form of identification number that is valid and current. The Australian mobile operator also has brought in Deloitte to lead an investigation into the cybersecurity incident, including how it occurred and how it could have been prevented. Optus said in a statement Monday that Deloitte's "independent external review" of the breach would encompass the telco's security systems, controls, and processes. It added that the move was supported by the board of its parent company Singtel, which had been "closely monitoring" the situation. Elaborating on Deloitte's forensic assessment, Optus CEO Kelly Bayer Rosmarin said: "This review will help ensure we understand how it occurred and how we can prevent it from occurring again. It will help inform the response to the incident for Optus. This may also help others in the private and public sector where sensitive data is held and risk of cyberattack exists." In its statement, Optus added that it had worked with more than 20 government agencies to determine the extent of the data breach.


Why cyber security strategy must be more than a regulatory tick-box exercise

While technology plays a critical role in an effective cyber security strategy, it alone does not provide the solution. Business leaders must also consider the organisation’s processes and people. If organisations don’t have the right processes or people in place to manage new technologies, it can be easy to revert to old habits. Many organisations opt for a hybrid Security Operations Centre to underpin their MDR strategy, which combines the cyber skills of in-house engineers, cyber security teams and an MSSP to create a single facility. MSSPs fill in the gaps in defences while upskilling in-house teams to stay on top of changing threats and technologies. This approach can also free in-house staff to drive projects and internal improvements while the MSSP takes the lead on high-value incidents. If the goal is to improve cyber security whilst meeting your organisational goals, then regulations will only ever go so far in tackling the issue. Attacks will continue to plague all sectors and proper detection, response and remediation will be what makes the difference between those that make the news and those that don’t.


Mozilla is looking for a scapegoat

Not so long ago, Microsoft’s Internet Explorer dominated market share. Antitrust authorities helped change that, but Google, not Mozilla, stepped up to take Microsoft’s place, yet without the bully pulpit of a dominant operating system. Meanwhile, as far back as 2008, I was writing about Mozilla’s chance to make Firefox a true community-developed web platform. It didn’t succeed, though Mozilla has gifted us incredible innovations such as Rust. Clearly there are smart people at Mozilla and they have demonstrated the ability to push the envelope on innovation. But not with Firefox. DuckDuckGo has carved out a growing, sizeable niche in privacy-oriented search, but Mozilla keeps losing similar ground in browsers. Why? In its report, Mozilla says browser freedom has been “suppressed for years through online choice architecture and commercial practices that benefit platforms and are not in the best interest of consumers, developers, or the open web.” This would be more credible coming from Mozilla if it weren’t the same company that completely mismanaged its entrance into the mobile market.


Indonesia Data Protection Law Includes Potential Prison Time

The Indonesia data protection law took some eight years to come to fruition, with contentious ongoing debate about what government body should oversee the new regulations and exactly how strong the penalties should be. A recent wave of cyber attacks and data breaches in the country seems to have prompted legislative action; Kaspersky reports that the country experienced 11.8 million cyberattacks in the first quarter of 2022, a 22% increase from the prior year, and the country has become the leading target for ransomware attacks in Southeast Asia. This includes data breaches of various government agencies, one of which exposed the vaccination records of President Joko Widodo. Stats from SurfShark indicate that Indonesia now has the third-highest rate of data breaches in the world. Regulation oversight has fallen to the executive branch, with the President slated to form an oversight body tasked with determining and administering fines. Similar to the EU’s General Data Protection Regulation (GDPR), which the Indonesia data protection law drew from substantially, there is a maximum potential fine of 2% of global annual turnover for violations.


How To Protect Your Reputation After A Hack Or Data Breach

Part of transparency and recovery is working with the relevant authorities and experts to track the scope of the breach. A post-mortem analysis can be critical. For one thing, it can determine what data was stolen, by whom and how. It can also help track where that data ends up and how it is used. In cases where the cause has something to do with software or hardware being exploited, it can be essential to inform the developers or manufacturers of the breach and how it occurred. They may also need to issue patches or recalls to prevent other businesses using that hardware or software from being compromised. No business stands alone. ... Recovery after a breach is a sensitive time. You will undoubtedly see a deluge of negative reviews and bad press, which will be difficult to counteract. Clear and transparent messaging is part of it; breaches happen, and there's no surefire way to avoid them. Demonstrating that your data security policies prevented usable data from being stolen or that you've been able to protect users proactively can be critical to repairing your reputation.


Data quality is at the heart of successful data governance

The downstream effects of data quality have ramifications felt throughout data governance efforts. Recent findings from a survey by Enterprise Strategy Group showed that data management is greatly challenged by a lack of visibility and compounded by data quality issues. Concerningly, 42 percent of all respondents indicated at least half of their data was “dark data” - retained by the organization, but unused, unmanageable, and unfindable. An influx in dark data and a lack of data visibility often leads to downstream bottlenecks, impeding the accuracy and effectiveness of operational data. Data quality was the top driver for organizations’ data governance programs but was also the top challenge that these organizations have to overcome to maximize the return on their data governance efforts. When you consider the fact that many organizations are experiencing data quality issues, which are difficult to manage, and in many cases have significant amounts of data that is dark, there is a clear need for more robust data governance solutions providing data landscape transparency united with business context and guidance.



Quote for the day:

"Perhaps the ultimate test of a leader is not what you are able to do in the here and now - but instead what continues to grow long after you're gone" -- Tom Rath

Daily Tech Digest - October 02, 2022

CIOs Still Waiting for Cloud Investments to Pay Off

“You discover your cloud architecture is immature when you’re surprised by the bill,” said Mr. Roese. “Usually the reason it was expensive is that they processed the data in a suboptimal way, in the wrong place with the wrong tools with the wrong economic model.” Roughly 67% of 1,000 senior technology leaders at U.S. firms across industries said they have yet to see a significant return on cloud investments, KPMG said Thursday in its annual technology survey. “Many first movers expected significant IT cost efficiency from their cloud investments,” said Barry Brunsman, a principal in KPMG’s CIO Advisory group. “We have seen a shift away from that expectation in favor of speed and agility.” Mr. Brunsman said the most common issues preventing a better return on cloud spending were insufficient talent or skills among tech teams, added security and compliance requirements, and a misalignment on expected outcomes. Global cloud spending is projected to reach $830.5 billion by the end of the year, up 17.5% from 2021, but slowing from last year’s growth rate of 18.3%, according to International Data Corp. It expects growth to drop to 16.3% next year.


To Code or Not to Code: The Benefits of Automating Software Testing

The hoped-for solution to these problems has been test automation. This offers a way to reduce manual involvement, test greater volumes, remove the risk of human error and accelerate time to market as much as tenfold for a serious competitive advantage. Yet, even organizations that have rolled out automated testing have discovered that a big problem remains. They can’t easily scale these solutions because professional-grade coding skills are still required. Even if they are marketed as “low-code,” they’re still far too complex for business users. Even testers usually lack the coding skills to set up tests on their own. As a result, coding skills remain a resource bottleneck that slows down the testing process and limits collaboration with business users. What’s more, as automated testing becomes more pervasive throughout the organization, the maintenance workload grows. All those test engineers an organization hired to implement an automation framework increasingly spend their time maintaining code instead of building bigger and smarter test scopes. Scaling becomes impossible.


Where JavaScript is headed in 2022

Angular is showing ominous signs of weakness around retention and interest, ranking near the bottom at #9. Nevertheless, it remains #2 for actual usage, and #3 for awareness. Vue continues to be a strong contender, with a decent ranking across all categories. Overall the story on the front end is of incremental refinements, rather than revolutionary upheaval. And on the back end? Next.js instigated the full-stack JavaScript movement, and remains second only to Express in both awareness and usage. The comparison of Next to Express is of course imperfect. Express is a server-side framework only, the workhorse of Node-based HTTP. ... Another interesting discovery in relation to languages that compile to JavaScript is the popularity of Elm. Elm is an ingenious functional language geared for web development, and highly regarded for its enablement of fast and fluent applications. But it’s also a mothballed project without any commits for months. The takeaway? Clearly the basic ideas in Elm are still desirable and popular. Perhaps a new leader could take up the project and carry it forward to the benefit of the entire ecosystem.


Blockchain: An Immutable Future?

The general consensus is that the pandemic delayed the adoption of distributed ledger technologies. Companies worldwide felt the repercussions of supply chain disruption and changes to consumer habits, which meant that the implementation of blockchain fell low on priority lists. However, in some cases, blockchain technology was used effectively to coordinate logistics. Despite the limited use, the global crisis helped drive further discussion about blockchain’s benefits; for example, ledger technologies could potentially have helped counter the fake vaccines that flooded the market during the height of the pandemic. This topic also feeds into a wider conversation about the influx of counterfeit medicines in pharma supply chains. ... The computing power needed to mine (add information to the blockchain) plus the duplication of work is an obvious source of environmental concern. Beyond environmental challenges, every piece of data added to a blockchain needs to be transcribed onto every copy of the blockchain, resulting in a much greater cost than server or cloud storage. 


The 5 Absolute Best X-Factors That Increase Enterprise Value

The problem with the traditional business model is that you don’t know when clients return, if at all. Both cash flow and forecasting are problematic. The first step in transforming your business model is implementing recurring revenue. Again, you can look to Software as a Service (SaaS) for inspiration. Chances are, you pay a monthly fee for your favorite movie streaming service. Customers face no hidden billing surprises, as they know exactly what they’ll pay. You enjoy the predictability of cash flow and now have accurate budgets. The second step in transforming your business model is creating long-term exclusive contracts. Your mission is to show clients why they are better off with a long-term commitment. ... A rich and thriving culture is the foundation of your business. It’s your culture that plays a role in the customer experience. Your culture determines if you leverage a market opportunity or not. Buyers look for a culture that promotes resilience, innovation, and accountability. Your future buyer knows that people come and go in a business, including you. A rich and thriving culture transcends people and takes on a life of its own.


Three Ways Banks Can Engage Younger Consumers in the Metaverse

The metaverse lets banks roll out the virtual red carpet for customers, with tailored experiences for specific segments and personas. Personalized virtual banking enables that special something that leaves customers feeling valued. Within a metaverse branch, banks can create virtual rooms in which avatars of relationship managers and customer advisors work one-on-one with high-net-worth individuals, for instance. They might also provide services to individuals looking to create a college fund or businesses interested in obtaining loans. Metaverse banking’s combination of personalization and community puts a fresh, modern spin on CX, and it’s an especially powerful draw for young banking consumers who are critical to the future of banking. ... To connect with the next generation of connected consumers, banks must begin building their presence among the more popular metaverses and increase engagement with younger demographic audiences through 3D banking, personalized services and DAOs. The good news? For payment providers and retail and commercial banks, there are no obstacles surrounding the preparation, and it is not too late to get ahead. 


Biology Inspires a New Kind of Water-Based Circuit That Could Transform Computing

Since this is closer to the way the brain transports information, they say, their device could be the next step forward in brain-like computing. "Ionic circuits in aqueous solutions seek to use ions as charge carriers for signal processing," write the team led by physicist Woo-Bin Jung of the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) in a new paper. "Here, we report an aqueous ionic circuit… This demonstration of the functional ionic circuit capable of analog computing is a step toward more sophisticated aqueous ionics." A major part of signal transmission in the brain is the movement of charged molecules called ions through a liquid medium. Although the incredible processing power of the brain is extremely challenging to replicate, scientists have thought that a similar system might be employed for computing: pushing ions through an aqueous solution. This would be slower than conventional, silicon-based computing, but it might have some interesting advantages.


Failure of Russia’s cyber attacks on Ukraine is most important lesson for NCSC

She said three things could be attributed to the “unexpected” lack of success – “impressive” Ukrainian cyber defences, “incredible” support from the cyber security sector and “impressive collaboration” between the US, EU, Nato, the UK, and others. “Just as we have seen inspirational and heroic defence by Ukrainian military on the battlefield, we have seen incredibly impressive defensive cyber operations by Ukrainian cyber security practitioners. Many commentators have suggested that this has been the most effective defensive cyber activity undertaken under sustained pressure in history,” added Cameron. She said the constant cyber attacks on Ukraine, emanating from Russia, over the past decade had prepared the country’s cyber defences. “In many ways, Russia has made Ukraine match fit over the past 10 years by consistently attacking them,” said Cameron. “We haven’t seen ‘cyber Armageddon’, but that’s not a surprise to cyber professionals, who never expected it. What we have seen is a very significant conflict in cyber space – probably the most sustained and intensive cyber campaign on record.”


Feds: Chinese Hacking Group Undeterred by Indictment

The United States began publicly indicting Chinese hackers in 2014 in a strategy to pressure Beijing by exposing the organizations and individuals behind state-sponsored cybertheft. The strategy seemed to pay dividends when Chinese leader Xi Jinping in September 2015 pledged to cease cyber-enabled economic espionage. The strategy's utility has since come under mounting fire as it became apparent that Chinese state-sponsored hacking responded to Xi's promise by becoming stealthier rather than by ending. "The indictment did not hinder APT41's operations as they progressed into 2021," concludes the Department of Health and Human Services' Health Sector Cybersecurity Coordination Center in a Thursday threat brief. "Stealing foreign IP is a primary objective of state-sponsored Chinese cyberespionage groups such as APT41, as it contributes to China's ambitious business and economic development goals," says Paul Prudhomme, a former Department of Defense threat analyst who is head of threat intelligence advisory at Rapid7.


What Is Artificial Intelligence in Software Testing?

You may wonder, “Don’t test automation tools do this already?” Of course, test automation tools already have AI in effect, but they have limitations. Where AI shines in software development is when it’s applied to remove those limitations, enabling software test automation tools to provide even more value to developers and testers. The value of AI comes from reducing the direct involvement of the developer or tester in the most mundane tasks. We still have a great need for human intelligence in applying business logic, strategic thinking, creative ideas, and the like. For example, consider that most, if not all, test automation tools run tests for you and deliver results. Most don’t know which tests to run, so they run all of them or some predetermined set. What if an AI-enabled bot could review the current state of test statuses, recent code changes, code coverage, and other metrics, and then decide which tests to run and run them for you? Bringing in decision-making that’s based on changing data is an example of applying AI. Good news! Parasoft handles automated software testing at this level.
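The idea of a bot choosing which tests to run can be sketched in a few lines. This is a hypothetical illustration of change-based test selection, not any vendor's actual implementation: given a map of which source files each test covers, only the tests whose covered files intersect the change set are run.

```python
# Hypothetical sketch of change-based test selection. The coverage map
# (test name -> source files it exercises) would come from a prior
# instrumented run; names here are illustrative.

def select_tests(coverage_map, changed_files):
    """Return the tests whose covered files intersect the change set."""
    changed = set(changed_files)
    return sorted(
        test for test, covered in coverage_map.items()
        if changed & set(covered)
    )

coverage_map = {
    "test_login": ["auth.py", "session.py"],
    "test_billing": ["billing.py", "invoice.py"],
    "test_search": ["search.py"],
}

# Only tests touching the changed files are selected.
print(select_tests(coverage_map, ["auth.py"]))                 # ['test_login']
print(select_tests(coverage_map, ["invoice.py", "search.py"]))  # ['test_billing', 'test_search']
```

A fuller system would weigh in recent failure history and risk scores rather than coverage alone, but the decision step looks much like this.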



Quote for the day:

"It's not the position that makes the leader. It's the leader that makes the position." -- Stanley Huffty

Daily Tech Digest - October 01, 2022

3 wins and 3 losses for cloud computing

The cloud can successfully provide business agility. I always tell my clients that most enterprises moved to cloud for the perceived cost savings but stayed for the agility. Cloud provides businesses with the ability to turn IT on a dime, and enterprises that move fast in today’s more innovative markets (such as retail, healthcare, and finance) find that the speed at which cloud systems can change provides a real force multiplier for the business. The cloud offers industrial-strength reliability. Most who pushed back on cloud computing argued that we would put all our eggs in a basket that could prove unreliable. ... The businesses that moved to cloud computing anticipated significant cost savings. Those savings never really materialized except for completely new businesses that had no prior investment in IT. In fact, most enterprises looked at their cloud bills with sticker shock. The primary culprit? Enterprises that did not use cloud finops programs to effectively manage cloud costs. Also, cloud providers offered pricing and terms that many enterprises did not understand (and many still don’t).


Data storytelling: A key skill for data-driven decision-making

Rudy is a firm believer in letting the data unfold by telling a story so that when the storyteller finally gets to the punch line or the “so what, do what” there is full alignment on their message. As such, storytellers should start at the top and set the stage with the “what.” For example, in the case of an IT benchmark, the storyteller might start off saying that the total IT spend is $X million per year (remember, the data has already been validated, so everyone is nodding). The storyteller should then break it down into five buckets: people, hardware, software, services, other (more nodding), Rudy says. Then further break it down into these technology areas: cloud, security, data center, network, and so on (more nodding). Next the storyteller reveals that based on the company’s current volume of usage, the unit cost is $X for each technology area and explains that compared to competitors of similar size and complexity, the storyteller’s organization spends more in certain areas, for example, security (now everyone is really paying attention), Rudy says.
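The benchmark walk-through above can be made concrete with a few lines of arithmetic. This is an illustrative sketch only; every figure and category name is made up for the example.

```python
# Illustrative IT benchmark roll-up: total spend, the five buckets,
# then a unit cost per technology area from usage volumes.
# All numbers are invented for the example.

spend = {  # annual spend in $M by bucket
    "people": 40, "hardware": 25, "software": 20, "services": 10, "other": 5,
}
usage = {  # spend and usage volume per technology area
    "cloud":    {"spend_m": 30, "units": 1200},  # e.g. VM-months
    "security": {"spend_m": 18, "units": 600},   # e.g. endpoints
}

total = sum(spend.values())
unit_cost = {area: v["spend_m"] * 1_000_000 / v["units"] for area, v in usage.items()}

print(total)                         # 100 ($M total IT spend)
print(round(unit_cost["security"]))  # 30000 (per endpoint, per year)
```

The "so what" then comes from comparing each unit cost against peer benchmarks, exactly as the narrative above builds toward.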


Active vs. Passive Network Monitoring: What Should Your Company Use?

Active network monitoring, also known as synthetic network monitoring, releases test traffic onto the network and observes that traffic as it travels through. This traffic is not taken from actual transactions that occur on a network, but rather sent through the network in order for your monitoring solution to examine it on its path. Test traffic usually mimics the typical network traffic that flows through your system so your administrators will gain the most relevant insights to its network. ... Passive network monitoring refers to capturing network traffic that flows through a network and analyzing it afterwards. Through a collection method like log management or network taps, passive monitoring compiles historic network traffic to paint a bigger picture of your company’s network performance. The primary use for passive network monitoring is discovering and predicting performance issues that happen at specific instances and areas of your network. ... The question that might be passing through your mind is “should my business use active monitoring or passive monitoring for my network performance strategy?”
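The core of an active (synthetic) probe is simple: run a test transaction, time the round trip, and compare it to a latency budget. The sketch below is a minimal illustration with an injected probe function; in practice the probe would be a real HTTP request or TCP handshake against the target, and the names here are assumptions, not a real tool's API.

```python
import time

def measure_probe(probe, latency_budget_ms):
    """Run one synthetic transaction and report its latency vs. a threshold."""
    start = time.perf_counter()
    probe()  # in practice: an HTTP GET or TCP handshake against the target
    elapsed_ms = (time.perf_counter() - start) * 1000
    return {"latency_ms": elapsed_ms, "within_budget": elapsed_ms <= latency_budget_ms}

# Stand-in probe that sleeps briefly, mimicking network round-trip time.
result = measure_probe(lambda: time.sleep(0.01), latency_budget_ms=100)
print(result["within_budget"])  # True on any reasonably fast machine
```

A passive monitor, by contrast, would never generate the probe at all; it would derive the same latency figures after the fact from captured traffic or logs.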


A New Linux Tool Aims to Guard Against Supply Chain Attacks

“Not too long ago, the only real criteria for the quality of a piece of software was whether it worked as advertised. With the cyber threats facing Federal agencies, our technology must be developed in a way that makes it resilient and secure,” Chris DeRusha, the US federal chief information security officer and deputy national cyber director, wrote in the White House announcement. "This is not theoretical: Foreign governments and criminal syndicates are regularly seeking ways to compromise our digital infrastructure.” When it comes to Wolfi, Santiago Torres-Arias, a software supply chain researcher at Purdue University, says that developers could accomplish some of the same protections with other Linux distributions, but that it’s a valuable step to see a release that’s been stripped down and purpose-built with supply chain security and validation in mind. “There’s past work, including work done by people who are now at Chainguard, that was kind of the precursor of this train of thought that we need to remove the potentially vulnerable elements and list the software included in a particular container or Linux release,” Torres-Arias says.


Using governance to spur, not stall, data access for analytics

“Without good governance controls, you not only have the policy management risk, but you also risk spending much, much more money than you intend, much faster,” says Barch. “We knew that maximizing the value of our data, especially as the quantity and variety of that data scales, was going to require creating integrated experiences with built-in governance that enabled the various stakeholders involved in activities like publishing data, consuming data, governing data and managing the underlying infrastructure, to all seamlessly work together.” What does this blended approach to data governance look like? For Capital One, it’s what Barch calls “sloped governance.” With a sloped governance approach, you can increase governance and controls around access and security for each level of data. For example, private user spaces, which don’t contain any shared data, can have minimal data governance requirements. As you move further into production, the controls get stricter and take more time to be implemented.
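The "sloped governance" idea lends itself to a small sketch: as data moves from private user spaces toward production, each tier enables at least as many controls as the last. The tier names and control flags below are illustrative assumptions, not Capital One's actual policy model.

```python
# Hypothetical "sloped governance" ladder: controls tighten as data
# moves from private workspaces toward production.

TIERS = [  # ordered from least to most governed
    ("private_workspace", {"access_review": False, "approval_required": False, "lineage_tracking": False}),
    ("shared_analytics",  {"access_review": True,  "approval_required": False, "lineage_tracking": True}),
    ("production",        {"access_review": True,  "approval_required": True,  "lineage_tracking": True}),
]

def controls_for(tier_name):
    """Look up the control set for a named tier."""
    return dict(TIERS)[tier_name]

# The slope: each step up enables at least as many controls as the last.
counts = [sum(c.values()) for _, c in TIERS]
print(counts)                                           # [0, 2, 3]
print(controls_for("production")["approval_required"])  # True
```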


Microsoft: Hackers are using open source software and fake jobs in phishing attacks

The hacking group has targeted employees in media, defense and aerospace, and IT services in the US, UK, India, and Russia. The group was also behind the massive attack on Sony Pictures Entertainment in 2014. Also known as Lazarus, and tracked by Microsoft as ZINC, Google Cloud's Mandiant threat analysts saw the group spear-phishing targets in the tech and media sectors with bogus job offers in July, using WhatsApp to share a trojanized instance of PuTTY. "Microsoft researchers have observed spear-phishing as a primary tactic of ZINC actors, but they have also been observed using strategic website compromises and social engineering across social media to achieve their objectives," MSTIC notes. "ZINC targets employees of companies it's attempting to infiltrate and seeks to coerce these individuals into installing seemingly benign programs or opening weaponized documents that contain malicious macros. Targeted attacks have also been carried out against security researchers over Twitter and LinkedIn."


New deepfake threats loom, says Microsoft’s chief science officer

In a Twitter thread, MosaicML research scientist Davis Blaloch described interactive deepfakes as “the illusion of talking to a real person. Imagine a scammer calling your grandmom who looks and sounds exactly like you.” Compositional deepfakes, he continued, go further with a bad actor creating many deepfakes to compile a “synthetic history.” ... The rise of ever-more sophisticated deepfakes will “raise the bar on expectations and requirements” of journalism and reporting, as well as the need to foster media literacy and raise awareness of these new trends. In addition, new authenticity protocols to confirm identity might be necessary, he added – even new multifactor identification practices for admittance into online meetings. There may also need to be new standards to prove content provenance, including new watermark and fingerprint methods; new regulations and self-regulation; red-team efforts and continuous monitoring.


How Does WebAuthn Work?

WebAuthn is quite clever. It leverages the power of public key cryptography to create a way for users to log in to mobile and web applications without those applications having to store any secret information at all. Usually, when one thinks of public key cryptography, one thinks of using it to send a secret message to a person who then decrypts it and reads it. Well, this can kind of work in reverse. If you send them a message encrypted with their public key, then they – and only they – can decrypt it, because only they have the private key that corresponds to the given public key. Once they do, you can be highly confident that they are the entity that they say they are. Currently, all the major browsers – Chrome, Firefox, Edge, and Safari – support the WebAuthn specification. If your phone – iPhone or Android – has a fingerprint reader or facial scanner, it supports WebAuthn. Windows provides WebAuthn support via Windows Hello. All of this translates to passwordless authentication quite nicely.
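The challenge-response flow at the heart of WebAuthn can be shown with a toy example. To keep it self-contained, this sketch uses textbook RSA with tiny, insecure demo primes (p=61, q=53); real WebAuthn authenticators use proper algorithms such as ES256 and a richer message format. Only the flow is the point: the server issues a random challenge, the client signs it with a private key it never reveals, and the server verifies with the public key alone.

```python
import hashlib

# Toy challenge-response in the spirit of WebAuthn. The RSA key below
# (n=3233, e=17, d=2753) is a classic textbook example and utterly
# insecure; it only illustrates sign-with-private / verify-with-public.

n, e, d = 3233, 17, 2753  # public modulus & exponent; private exponent

def digest(challenge: bytes) -> int:
    # Reduce the challenge hash into the toy key's range.
    return int.from_bytes(hashlib.sha256(challenge).digest(), "big") % n

def sign(challenge: bytes) -> int:
    """Client side: sign the server's challenge with the private key."""
    return pow(digest(challenge), d, n)

def verify(challenge: bytes, signature: int) -> bool:
    """Server side: check the signature using only the public key."""
    return pow(signature, e, n) == digest(challenge)

challenge = b"server-issued-random-nonce"
sig = sign(challenge)
print(verify(challenge, sig))               # True
print(verify(b"different-challenge", sig))  # False (a stale signature fails)
```

Because the server stores only the public key, a database breach leaks nothing that lets an attacker impersonate the user, which is the property the passage above is describing.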


Why developers hold the key to cloud security

APIs drive cloud computing. They eliminate the requirement for a fixed IT architecture in a centralized data center. APIs also mean attackers don’t have to honor the arbitrary boundaries that enterprises erect around the systems and data stores in their on-premises data centers. While identifying and remediating misconfigurations is a priority, it’s essential to understand that misconfigurations are just one means to the ultimate end for attackers: control plane compromise. This has played a central role in every significant cloud breach to date. Empowering developers to find and fix cloud misconfigurations when developing IaC is critical, but it’s equally important to give them the tools they need to design cloud architecture that’s inherently secure against today’s control plane compromise attacks. ... Developers are in the best (and often only) position to secure their code before deployment, maintain its secure integrity while running, and better understand the specific places to provide fixes back in the code. But they’re also human beings prone to mistakes operating in a world of constant experimentation and failure. 
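A minimal sketch of the pre-deployment check this implies: scanning a declarative resource definition for a classic misconfiguration. The resource shape here is hypothetical, not any specific IaC tool's format.

```python
# Flag security-group rules that expose non-HTTPS ports to the whole internet.
def world_open_ports(security_group):
    return [rule["port"] for rule in security_group.get("ingress", [])
            if rule.get("cidr") == "0.0.0.0/0" and rule.get("port") != 443]

sg = {"ingress": [{"cidr": "0.0.0.0/0", "port": 22},    # SSH open to the world
                  {"cidr": "10.0.0.0/8", "port": 5432},  # internal only: fine
                  {"cidr": "0.0.0.0/0", "port": 443}]}   # public HTTPS: fine

assert world_open_ports(sg) == [22]  # caught before deployment, not after breach
```

Running checks like this in the developer's workflow, rather than in a post-deployment audit, is the shift the article argues for.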


IT enters the era of intelligent automation

Companies also need to optimize business processes to increase the effectiveness of automation, Nallapati says. “Working together in a partnership, the business unit and the automation teams can leverage their expertise to refine the best approach and way forward to optimize the efficiency of the bot/automation,” she says. Technology leaders should make sure to get business leaders and users involved in the IA process, Ramakrishnan says. “Educate them about the possibilities and collaborate with them in joint problem-solving sessions,” he says. ... “With a large number of customers and a large number of invoices to process every day, any small savings through automation goes a long way in increasing productivity, accuracy, and improving employee and end customer satisfaction,” Ramakrishnan says. Similar to the type of hackathons that are common in IT organizations today, Ramakrishnan says, “we partnered with the business to have a business-side hackathon/ideathon. We educated the key users from the billing team on the possibilities of automation, and then they were encouraged to come back with ideas on automation.”



Quote for the day:

"Effective team leaders realize they neither know all the answers, nor can they succeed without the other members of the team." -- Katzenbach & Smith

Daily Tech Digest - September 30, 2022

5 Signs That You’re a Great Developer

Programming directly changes the way your brain works: you start to think more algorithmically and solve problems faster, and that carries over into other aspects of your life. Good programmers can not only learn almost anything else much faster, especially in tech-related fields, but also make great entrepreneurs and CEOs. Look at Elon Musk, for instance: he was a programmer and built his own game when he was 12. ... As we briefly discussed previously, programming encourages creative thinking and teaches you how to approach problems in the most effective way. But in order to do so, you must first be able to solve a lot of these difficulties and have a passion for doing so; only then will you probably succeed as a developer. If you’ve just started and thought it was easy, you’re completely wrong. You just haven’t encountered genuinely challenging problems yet, and the more you learn, the more difficult and complex the problems become. You need not only to solve them, but to solve them in the most effective way possible, speed up your algorithm, and optimize everything.


Experimental WebTransport over HTTP/3 support in Kestrel

WebTransport is a new draft specification for a transport protocol, similar to WebSockets, that allows the use of multiple streams per connection. WebSockets allowed upgrading a whole HTTP TCP/TLS connection to a bidirectional data stream. If you needed to open more streams, you’d spend additional time and resources establishing new TCP and TLS sessions. WebSockets over HTTP/2 streamlined this by allowing multiple WebSocket streams to be established over one HTTP/2 TCP/TLS session. The downside is that, because this was still based on TCP, any packets lost on one stream would cause delays for every stream on the connection. With the introduction of HTTP/3 and QUIC, which uses UDP rather than TCP, WebTransport can be used to establish multiple streams on one connection without them blocking each other. For example, consider an online game where the game state is transmitted on one bidirectional stream, the players’ voices for the game’s voice chat feature on another bidirectional stream, and the player’s controls on a unidirectional stream.
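The head-of-line-blocking difference can be illustrated with a toy delivery simulation. Stream names, packet ordering, and the loss pattern are invented for the example; real TCP and QUIC behavior involves retransmission timers and flow control omitted here.

```python
# Packets from two multiplexed streams, in arrival order; one packet is lost.
packets = [("chat", 1), ("game", 2), ("chat", 3), ("game", 4)]
lost = {("chat", 1)}

def delivered_before_retransmit(multiplexed_over_tcp):
    delivered, stalled_streams = [], set()
    for stream, seq in packets:
        if (stream, seq) in lost:
            if multiplexed_over_tcp:
                break                    # TCP: every stream stalls behind the loss
            stalled_streams.add(stream)  # QUIC: only the affected stream stalls
            continue
        if stream in stalled_streams:
            continue                     # in-stream ordering still holds
        delivered.append((stream, seq))
    return delivered

assert delivered_before_retransmit(True) == []   # WebSocket-over-TCP style: all blocked
assert delivered_before_retransmit(False) == [("game", 2), ("game", 4)]  # QUIC style
```

In the QUIC case the game stream keeps flowing while the chat stream waits for its retransmission, which is exactly the property WebTransport exposes to applications.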


Software builders across Amazon require consistent, interoperable, and extensible tools to construct and operate applications at our peculiar scale; organizations will extend on our solutions for their specialized business needs. Amazon’s customers benefit when software builders spend time on novel innovation. Undifferentiated work elimination, automation, and integrated opinionated tooling reserve human interaction for high judgment situations. Our tools must be available for use even in the worst of times, which happens to be when software builders may most need to use them: we must be available even when others are not. Software builder experience is the summation of tools, processes, and technology owned throughout the company, relentlessly improved through the use of well-understood metrics, actionable insights, and knowledge sharing. Amazon’s industry-leading technology and access to top experts in many fields provides opportunities for builders to learn and grow at a rate unparalleled in the industry. As builders we are in a unique position to codify Amazon’s values into the technical foundations; we foster a culture of belonging by ensuring our tools, training, and events are inclusive and accessible by design.


The Troublemaker CISO: How Much Profit Equals One Life?

We take for granted that those charged with protecting us are doing so with our best interests at heart. There is no shaving off another few cents to increase shareholder value when a person's life is at stake. Luckily for me, there is a shift in boardrooms and governing bodies toward examining how socially responsible you are and whether you are acting in the best interest of the people and not just the investors. If the members of the board and governing body are considering these topics when steering a business, isn't it time to re-examine how and why we do things? Are we as CISOs not accountable to leadership to impress on them the risk that IoT/internet connectivity poses to critical networks - and especially to healthcare? It is time to be firm in expressing the risk and saying we would rather spend a bit more money and time and do it the safe way. And this should be listed as the top risk in the company. The other big issue I have with this type of network being connected is one of transparency.


Digital Twins Offer Cybersecurity Benefits

A key difficulty, from a cybersecurity perspective, is the fact that drug production lines are made up of multiple different technologies, running different operating systems that are often provided by different suppliers. “Integrating multiple systems from different suppliers can provide expanded attack surface that can be exploited by cyber adversary,” continues Mylrea. To address this, Mylrea and Grimes developed what they refer to as “biosecure digital twins”—replicas of manufacturing lines they use to identify potential points of attack for hackers. “The digital twin is essentially a high-fidelity virtual representation of critical manufacturing processes. From a security perspective, this improves monitoring, detection, and mitigation of stealthy attacks that can go undetected by most conventional cybersecurity defenses,” explains Mylrea. “Beyond security, the biosecure digital twin can optimize performance and productivity by detecting when critical systems deviate from their ideal state and correct in real time to enable predictive maintenance that prevent costly faults and safety failures.”
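The "deviation from ideal state" detection Mylrea describes can be sketched as a threshold check against the twin's modeled setpoints. The process variables and tolerances below are hypothetical; a production digital twin runs physics-based or learned models, not a lookup table.

```python
# The twin's modeled ideal state and tolerance band for one production step.
ideal = {"temp_c": 37.0, "ph": 7.2}
tolerance = {"temp_c": 0.5, "ph": 0.2}

def deviations(reading):
    """Return the variables drifting outside the twin's tolerance band."""
    return [k for k in ideal if abs(reading[k] - ideal[k]) > tolerance[k]]

assert deviations({"temp_c": 37.2, "ph": 7.1}) == []          # within spec
assert deviations({"temp_c": 38.0, "ph": 7.1}) == ["temp_c"]  # flag for action
```

A stealthy attack that nudges a sensor or actuator shows up here as a slow drift from the modeled state, even when conventional network-level defenses see nothing.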


Unlocking cyber skills: This year’s essential back-to-school lesson plan

Technology is continually advancing, which will only create more avenues for cybersecurity roles in the future. While it’s essential to inform students about the types of careers in cybersecurity, teachers and career advisors should be aware of the skills and qualities the sector needs beyond technical computer and software knowledge. Once this is achieved, it can shed light on the roles students can go on to. Technical skills are critical in cybersecurity, yet they can be learned, fostered, and evolved throughout a student’s career. Schools need to tap into individual students’ strengths in hopes of encouraging them to pursue cyber positions. Broadly, cybersecurity enlists leaders, communicators, researchers, critical thinkers… the list goes on. Having the qualities needed to fulfil various roles in the industry can position a student remarkably well when they first start in the industry. Yet, this comes down to their mentors in high school being able to communicate that a student’s inquisitive nature or presentation skills can be applied to various sectors.


Data literacy: Time to cure data phobia

Data literacy is an incredibly important asset and skill set that should be demonstrated at all levels of the workplace. In simple terms, data literacy is the fundamental understanding of what data means, how to interpret it, how to create it and how to use it both effectively and ethically across business use cases. Employees who have been trained in and applied their knowledge of how to use company data demonstrate a high level of data literacy. Although many people have traditionally associated data literacy skills with data professionals and experts, it’s becoming necessary for employees from all departments and job levels to develop certain levels of data literacy. The Harvard Business Review stated: “Companies need more people with the ability to interpret data, to draw insights and to ask the right questions in the first place. These are skills that anyone can develop, and there are now many ways for individuals to upskill themselves and for companies to support them, lift capabilities, and drive change. Indeed, the data itself is clear on this: Data-driven decision-making markedly improves business performance.”


To BYOT & Back Again: How IT Models are Evolving

The growing complexity of IT frameworks is startling. A typical enterprise has upwards of 1,200 cloud services and hundreds of applications running at any given moment. On top of that, employees have their own smartphones, and many use their own routers and laptops. Meanwhile, various departments and groups -- marketing, finance, HR and others -- subscribe to specialized cloud services. The difficulties continue to pile up -- particularly as CIOs look to build out more advanced data and AI frameworks. McKinsey & Company found that between 10% and 20% of IT budgets are devoted to adding more technology in an attempt to modernize the enterprise and pay down technical debt. Yet, part of the problem, it noted, is “undue complexity” and a lack of standards, particularly at large companies that stretch across regions and countries. In many cases, orphaned and balkanized systems, data sprawl, data silos, and complex device management requirements follow. For CIOs seeking simplification and tighter security, the knee-jerk reaction is often to clamp down on choices and options.


IT leadership: What to prioritize for the remainder of 2022

To deliver product-centric value, it’s best to have autonomous, cross-functional teams running an Agile framework. Those teams can include technical practitioners, design thinkers, and business executives. Together, they can increase business growth by as much as 63%, Infosys’ Radar report uncovered. Cross-pollination efforts can spread Agile across the entire enterprise, building credibility and trust among high-level stakeholders toward an iterative process that can deliver meaningful, if incremental, business results. Big-bang rollouts, with a raft of modernizations released in one fell swoop, may seem attractive to management or other stakeholders. But they carry untold risk: developers scrambling to fix bugs after the fact, account teams working to retain disgruntled customers. Approach cautiously, and consider an Agile roadmap of smaller, iterative developments instead of the momentous release. Such a roadmap also breaks down the considerable task of application modernization into smaller, bite-sized chunks.


How Policy-as-Code Helps Prevent Cloud Misconfigurations

Policy-as-code is a great cloud configuration solution because it eliminates the potential for human error and makes it more difficult for hackers to interfere. Policy compliance is crucial for cloud security, ensuring that every app and piece of code follows the necessary rules and conditions. The easiest way to ensure nothing slips through the cracks is to automate the compliance management process. Policy-as-code is also a good choice in a federated risk management model, where a set of common standards is applied across the whole organization while departments or units retain their own methods and workflows. PaC fits seamlessly into this high-security system by scaling and automating IT policies throughout a company. Preventing cloud misconfiguration relies on effectively ensuring every app and line of code adheres to an organization’s IT policies. PaC offers some key benefits that make this possible without being a hassle. Policy-as-code improves the visibility of IT policies since everything is clearly defined in code format.
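A minimal policy-as-code sketch: policies expressed as plain functions evaluated against declarative resource definitions. The resource format and rules are invented for illustration; real engines such as Open Policy Agent use a dedicated policy language rather than host-language functions.

```python
# Each policy is a predicate over a resource definition; failures are violations.
def no_public_buckets(resource):
    return not (resource["type"] == "bucket" and resource.get("public", False))

def encryption_required(resource):
    return resource.get("encrypted", False) if resource["type"] == "bucket" else True

POLICIES = [no_public_buckets, encryption_required]

def evaluate(resources):
    """Return (resource name, failed policy name) pairs across the whole estate."""
    return [(r["name"], p.__name__)
            for r in resources for p in POLICIES if not p(r)]

violations = evaluate([
    {"type": "bucket", "name": "logs", "public": True, "encrypted": False},
    {"type": "bucket", "name": "assets", "public": False, "encrypted": True},
])
# "logs" fails both policies; "assets" passes cleanly.
```

Because the policies are code, they version, review, and test like any other code, which is where the visibility benefit comes from.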



Quote for the day:

"Without courage, it doesn't matter how good the leader's intentions are." -- Orrin Woodward

Daily Tech Digest - September 29, 2022

Hybrid work is the future, and innovative technology will define it

We’re starting to see an amplification of recognition tools, of coaching platforms, of new and exciting ways to learn that are leveraging mobility and looking at how people want to work and to meet them where they are, rather than saying, “Here’s the technology, learn how to use it.” It’s more about, “Hey, we’re learning how you want to work and we’re learning how you want to grow, and we’ll meet you there.” We’re really seeing an uptake in the HR tech space of tools that acknowledge the humaneness underneath the technology itself. ... The second layer consists of the business applications we’ve come to know and love. Those include HR apps, business applications, supply chain applications, and financial applications, et cetera. Certainly, there is a major role in this distributed work environment for virtual application delivery and better security. We need to access those mission-critical apps remotely and have them perform the same way whether they’re virtual, local, or software as a service (SaaS) apps -- all through a trusted access security layer.


Web 3: How to prepare for a technological revolution

It hardly needs to be said, but over the last few decades, the internet has grown to be arguably the most integral cog ensuring a smooth-running, functioning society. It is so ingrained that almost every industry in the world would be unable to function properly without it. And this reliance will only grow as Web 3 becomes the norm, which makes it critical that we begin to educate children now on its uses and how to navigate it. Already, many of today’s adults will find it difficult to explain what Web 3 is, let alone teach the next generation how to use it. Educating children early will not only help them thrive in the future, but they will also be able to pass gained knowledge up the chain to their parents. This is, of course, just history repeating itself. It is the equivalent of kids showing their parents how to use a touch screen or work their email. But the revolution Web 3 is about to bring is on a different scale to any previous technological advancement. Soon, the greatest opportunities will be solely available on the new internet, and it is critical we ensure every child has the opportunity to succeed.


Health data governance and the case for regulation

Without appropriate data governance procedures and training in place, healthcare organizations are likely to find themselves in danger of noncompliance. HIPAA violations in particular can occur at any level of an organization; if an undertrained staff member or an unsecured database is operating in your organization, there’s a strong likelihood that patient data will eventually be misused and HIPAA regulations breached. This kind of breach can lead to noncompliance, fines, legal issues, poorer patient experiences and even a loss of trust within the greater medical community. Data governance can mean the difference between a successful and fully operational facility and a facility that gets shut down by the government. On the other hand, when data governance principles are applied successfully in the healthcare sector, a slew of benefits outside of basic compliance can be realized. Patients feel confident that their information is safe and begin to refer their friends and family members to your network. Data becomes easier to find, label and organize for new operational use cases and emerging patient technologies.


Closing the Gap Between Complexity and Performance in Today’s Hybrid IT Environments

Nowadays, the increasing need for security on all fronts has fueled collaboration between teams on a regular basis. This, in turn, has spurred more proactivity from an internal IT operations perspective. Proactivity, bolstered by a unified view into traffic and communication, is a key aspect of closing the gap between cloud complexity and performance — because it starts at the IT cultural level. Technical capabilities like deep observability can support team prioritization of detection and management on a more holistic level, addressing all aspects of IT infrastructure. With this, organizations can feel more confident in overcoming cloud-based challenges and mitigating connected cyber vulnerabilities as a collective force. An all-encompassing, proactive approach is needed to speedily detect cyber threats, respond to the corresponding activity, and enact a remediation plan. Within hybrid and multi-cloud environments, data and communication costs can skyrocket. The most common use cases stem from packets, which can interfere with control of and visibility into the right data. 


Blockchain and artificial intelligence: on the wave of hype

Blockchain is an innovative digital information storage system that stores data in an encrypted, distributed ledger format. In operation, the data is encrypted and distributed across multiple computers, which makes it tamper-proof. It is a secure database that can only be read and updated by those with permission. There are a few examples on the web today of blockchain and artificial intelligence being interconnected, and academics and scientists have studied the combination. We see the two concepts working well together. ... Today’s computers are extremely fast, but they also require a constant supply of data and instructions, without which it is impossible to process information or perform tasks. The blockchain, run on standard computers, therefore requires significant computing power because of its encryption processes. Secure data monetization could be one result of combining blockchain and artificial intelligence. Monetization of collected data is a source of revenue for many companies; Facebook and Google are among the biggest and best-known examples.


How MLops deployment can be easier with open-source versioning

Among the many reasons why there are a growing number of vendors in the sector, a significant one is that building and deploying ML models is often a complicated process with many manual steps. A primary goal of MLops tools is to help automate the process of building and deploying models. While automation is important, it only solves part of the complexity. A key challenge for artificial intelligence (AI) models, identified in a recently released Gartner report, is that only about half of AI models actually end up making it into production. From Guttmann’s perspective, with application development, developers tend to have a linear way of building things. This implies, for example, that new code written six months after the initial development is better than the original. That same view does not tend to work with machine learning, as the process involves more research and more experimentation to determine what actually works best. “Development is always money sunk into the problem until you actually see the fruits of the effort and we want to decrease that development time to a minimum,” he said.
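The non-linear, experiment-driven workflow described here is why ML versioning tools track every run rather than assuming the latest is best. A toy sketch (the run records and metric are invented for the example):

```python
# Experiment runs are tracked like versions; the newest is not assumed best.
runs = [
    {"id": "run-1", "commit": "a1f", "val_accuracy": 0.81},
    {"id": "run-2", "commit": "b2c", "val_accuracy": 0.86},
    {"id": "run-3", "commit": "c3d", "val_accuracy": 0.84},  # newer, but worse
]

best = max(runs, key=lambda r: r["val_accuracy"])
assert best["id"] == "run-2"  # promote the best run, not the latest one
```

Keeping every run (and the commit that produced it) means a regression like run-3 costs an experiment, not a production rollback.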


Robotic Process Automation Will Shape the Future of Hotel Operations

On the guest-facing side of the business, automation can be applied to virtually every touchpoint of the guest journey. Marketing automation comes in the form of upsell opportunities, re-marketing or recovery campaigns in the pre-stay, pre-check-in and post-cancellation stages. In the back of the house, automation is helping the marketing, revenue and sales departments get more done with fewer resources. Integrated CRM systems have become the heart of new, guest-centric personalization strategies such as automated email marketing programs that are proving to be huge time-savers. Revenue managers are tapping automation to stay on top of pricing and demand trends. RPA reduces the common challenges presented by running a business on a fragmented tech stack. Siloed systems often lead to a great deal of manual effort, such as copying, importing, exporting data from one system to another, or the common “swivel-chair integration.” Through RPA, operators can create workflows that fill feature gaps or replicate features from other systems, saving them time and money.
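The "swivel-chair integration" RPA replaces is, at bottom, scripted data movement between siloed systems. A minimal sketch with hypothetical record shapes (real RPA platforms drive the systems' UIs or APIs rather than in-memory dicts):

```python
# Two siloed systems: a booking engine and a CRM, previously synced by hand.
bookings = [
    {"guest": "Ana", "email": "ana@example.com", "nights": 3},
    {"guest": "Ben", "email": "ben@example.com", "nights": 1},
]
crm = {}

def sync_bookings_to_crm():
    """RPA-style workflow: copy each booking into the CRM, keyed by email."""
    for b in bookings:
        crm[b["email"]] = {"name": b["guest"], "last_stay_nights": b["nights"]}

sync_bookings_to_crm()
assert crm["ana@example.com"] == {"name": "Ana", "last_stay_nights": 3}
```

Once the copy step is a workflow instead of a person, it runs on every booking, which is where the time and accuracy savings come from.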


Russian hackers' lack of success against Ukraine shows that strong cyber defences work

Since the invasion, Cameron said, "what we have seen is a very significant conflict in cyberspace – probably the most sustained and intensive cyber campaign on record." But she also pointed to the lack of success of these campaigns, thanks to the efforts of Ukrainian cyber defenders and their allies. "This activity has provided us with the clearest demonstration that a strong and effective cyber defence can be mounted, even against an adversary as well prepared and resourced as the Russian Federation." Cameron argued that not only does this provide lessons for what countries and their governments can do to protect against cyberattacks, but there are also lessons for organisations on how to protect against incidents, be they nation-state backed campaigns, ransomware attacks or other malicious cyber operations. "Central to this is a commitment to long-term resilience," said Cameron. "Building resilience means we don't necessarily need to know where or how the threat will manifest itself next. Instead, we know that most threats will be unable to breach our defences. And when they do, we can recover quickly and fully."


The Unlikely Journey of GraphQL

GraphQL is drawing the spotlight because refactoring or modernization of applications into microservices is stressing REST to its limits. As information consumers, we expect more from the digital platforms that we use. Shop for a product, and we will also want to find reviews, competing offers, autofill keyword search, and likely other options. Monolithic apps crack under the load, and for similar reasons, the same fate could be happening to REST, which requires pinpoint commands to specific endpoints – and with complex queries, lots of pinpoint requests. Facebook developers created GraphQL as a client specification for alleviating the bottlenecks that were increasingly cropping up when fetching data from polyglot sources to a variety of web and mobile clients. With REST, developers had to know all the endpoints. By contrast, with GraphQL, the approach is declarative: specify what data you need rather than how to produce it. While REST is imperative, GraphQL is declarative.
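To make the contrast concrete, compare the requests a client issues for a hypothetical product page; the endpoints and schema below are invented for the example.

```python
# REST: several pinpoint requests to endpoints the client must already know.
rest_calls = [
    "GET /products/42",
    "GET /products/42/reviews?limit=3",
    "GET /products/42/offers",
]

# GraphQL: one declarative query naming exactly the data the client needs.
graphql_query = """
query ProductPage {
  product(id: 42) {
    name
    price
    reviews(limit: 3) { rating text }
    offers { seller price }
  }
}
"""

assert len(rest_calls) == 3
assert "reviews(limit: 3)" in graphql_query
```

The server resolves the single query against however many backend sources it needs, so the client neither over-fetches nor chains round trips.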


Cryptojacking, DDoS attacks increase in container-based cloud systems

The Sysdig report also noted that there has been a jump in DDoS attacks that use containers since the start of the Russian invasion of Ukraine. "The goals of disrupting IT infrastructure and utilities have led to a four‑fold increase in DDoS attacks between 4Q21 and 1Q22," according to the report. "Over 150,000 volunteers have joined anti‑Russian DDoS campaigns using container images from Docker Hub. The threat actors hit anyone they perceive as sympathizing with their opponent, and any unsecured infrastructure is targeted for leverage in scaling the attacks." Meanwhile, a pro-Russian hacktivist group, called Killnet, launched several DDoS attacks on NATO countries. These include, but are not limited to, websites in Italy, Poland, Estonia, Ukraine, and the United States. “Because many sites are now hosted in the cloud, DDoS protections are more common, but they are not yet ubiquitous and can sometimes be bypassed by skilled adversaries,” Sysdig noted. “Containers pre‑loaded with DDoS software make it easy for hacktivist leaders to quickly enable their volunteers.”



Quote for the day:

"Good leaders value change, they accomplish a desired change that gets the organization and society better." -- Anyaele Sam Chiyson

Daily Tech Digest - September 28, 2022

How to Become an IT Thought Leader

Being overly tech-centric is a common mistake aspiring thought leaders make. Such individuals start with a technology, then look for problems to solve. “Instead, it's important to remember that an IT thought leader drives digital change,” Zhao says. “Understanding the technology is only one aspect of IT thought leadership.” Ross concurs. “I’ve seen several troubling examples of large technology purchases occurring before key business requirements were fully understood,” he says. “Seek first to understand the desired business outcomes and remember that technology is a potential enabler of those outcomes, but never a cure-all.” A strong business case is essential for any proposed new technology, Bethavandu says. “If your company is not ready for, say, DevOps or containerization, be self-aware and don’t push for those projects until your organization is ready,” he states. On the other hand, excessive caution can also be dangerous. “If you want to be a thought leader, you have to be bold and you cannot be afraid of failing,” Bethavandu says.


Most Attackers Need Less Than 10 Hours to Find Weaknesses

Overall, nearly three-quarters of ethical hackers think most organizations lack the necessary detection and response capabilities to stop attacks, according to the Bishop Fox-SANS survey. The data should convince organizations to not just focus on preventing attacks, but aim to quickly detect and respond to attacks as a way to limit damage, Bishop Fox's Eston says. "Everyone eventually is going to be hacked, so it comes down to incident response and how you respond to an attack, as opposed to protecting against every attack vector," he says. "It is almost impossible to stop one person from clicking on a link." In addition, companies are struggling to secure many parts of their attack surface, the report stated. Third parties, remote work, the adoption of cloud infrastructure, and the increased pace of application development all contributed significantly to expanding organizations' attack surfaces, penetration testers said. Yet the human element continues to be the most critical vulnerability, by far. 


Discover how technology helps manage the growth in digital evidence

With limited resources, even the most skilled law-enforcement personnel are hard-pressed to comb through terabytes of data that may include hours of videos, tens of thousands of images, and hundreds of thousands of words in the form of text, email, and other sources. One possible solution is to augment skilled investigators and forensic examiners with technology. Some of the key technological capabilities that can be applied to this problem are AI and machine learning. AI and machine learning models and applications create processes that read, watch, extract, index, sort, filter, translate, and transcribe information from text, images, and video. By utilizing technology to carve through and analyze data, it’s possible to reduce the data mountain to a series of small hills of related content and add tags that make it searchable. That allows people to spend their time and energy on work that is most valuable in the investigation. The good news is that help is available. Microsoft has multiple AI and machine learning processes within our Microsoft Azure Cognitive Services. 
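As a toy illustration of the extract-index-filter idea: tag evidence text so investigators can filter instead of reading everything. The tag vocabulary and documents are invented, and real systems use trained models rather than keyword sets.

```python
# Auto-tag evidence text so a data mountain becomes filterable small hills.
TAGS = {
    "finance": {"invoice", "transfer", "account"},
    "travel": {"flight", "hotel", "border"},
}

def tag(document: str) -> list[str]:
    words = set(document.lower().replace(",", " ").split())
    return sorted(t for t, keywords in TAGS.items() if words & keywords)

assert tag("Wire transfer to offshore account confirmed") == ["finance"]
assert tag("Flight booked, hotel paid by transfer") == ["finance", "travel"]
```

Even this crude pass turns "read everything" into "query for the finance items," freeing investigators for the judgment calls machines cannot make.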


The modern enterprise imaging and data value chain

The costs and consequences of the current fragmented state of health care data are far-reaching: operational inefficiencies and unnecessary duplication, treatment errors, and missed opportunities for basic research. Recent medical literature is filled with examples of missed opportunities—and patients put at risk because of a lack of data sharing. More than four million Medicare patients are discharged to skilled nursing facilities (SNFs) every year. Most of them are elderly patients with complex conditions, and the transition can be hazardous. ... “Weak transitional care practices between hospitals and SNFs compromise quality and safety outcomes for this population,” researchers noted. Even within hospitals, sharing data remains a major problem. ... Data silos and incompatible data sets remain another roadblock. In a 2019 article in the journal JCO Clinical Cancer Informatics, researchers analyzed data from the Cancer Imaging Archive (TCIA), looking specifically at nine lung and brain research data sets containing 659 data fields in order to understand what would be required to harmonize data for cross-study access.


Cloud’s key role in the emerging hybrid workforce

One key to the mistakes may be the overuse of cloud computing. Public clouds provide more scalable and accessible systems on demand, but they are not always cost-effective. I fear that much like when any technology becomes what the cool kids are using, cloud is being picked for emotional reasons and not business reasons. On-premises hardware costs have fallen a great deal during the past 10 years. Using these more traditional methods of storage and compute may be way more cost-effective than the cloud in some instances and may be just as accessible, depending on the location of the workforce. My hope is that moving to the cloud, which was accelerated by the pandemic, does not make us lose sight of making business cases for the use of any technology. Another core mistake that may bring down companies is not having security plans and technology to support the new hybrid workforce. Although few numbers have emerged, I suspect that this is going to be an issue for about 50% of companies supporting a remote workforce.


Why zero trust should be the foundation of your cybersecurity ecosystem

Recently, zero trust has developed a large following due to a surge in insider attacks and an increase in remote work – both of which challenge the effectiveness of traditional perimeter-based security approaches. A 2021 global enterprise survey found that 72% of respondents had adopted zero trust or planned to in the near future. Gartner predicts that spending on zero trust solutions will more than double to $1.674 billion between now and 2025. Governments are also mandating zero trust architectures for federal organizations. These endorsements from the largest organizations have accelerated zero trust adoption across every sector. Moreover, these developments suggest that zero trust will soon become the default security approach for every organization. Zero trust enables organizations to protect their assets by reducing the chance and impact of a breach. It also reduces the average breach cost by at least $1.76 million, can prevent five cyber disasters per year, and save an average of $20.1 million in application downtime costs.


Walls between technology pros and customers are coming down at mainstream companies

Tools assisting with this engagement include "prediction, automation, smart job sites and digital twins," he says. "We have resources in each of our geographic regions where we scale new technology from project to project to ensure the 'why' is understood, provide necessary training and support, and educate teams on how that technology solution makes sense in current processes and day-to-day operations." At the same time, getting technology professionals up to speed with crucial pieces of this customer collaboration -- user experience (UX) and design thinking -- is a challenge, McFarland adds. "There is a widely recognized expectation to create seamless and positive customer experiences. That said, specific training and technological capabilities are a headwind that professionals are experiencing. While legacy employees may be fully immersed and knowledgeable about a certain program and its technical capabilities, it is more unusual to have both the technical and UX design expertise. The construction industry is working to find the right balance of technology expertise and awareness with UX and design proficiencies."


Why Is the Future of Cloud Computing Distributed Cloud?

The distributed cloud model redefines cloud computing: a distributed cloud is a public cloud architecture that handles data processing and storage in a distributed manner. Simply put, a business using distributed cloud computing can store and process its data across multiple data centers, some of which may be physically located in other regions. A content delivery network (CDN) – a geographically dispersed network architecture – is an example of a distributed cloud. It is designed to deliver content (most frequently video or music) quickly and efficiently to viewers in various places, significantly lowering download times. Distributed clouds, however, offer advantages to more than just content producers and artists. They can be used in many business contexts, including transportation and sales, and a distributed cloud can even be confined to particular geographic regions. For instance, a supplier of file transfer services can format video using centralized cloud resources and store the content on CDNs spread around the globe.
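The reason a CDN lowers download times is simple geography: each request is served from the edge location closest to the viewer. Real CDNs do this via DNS or anycast routing, not client-side code, but the selection logic can be sketched with a hypothetical edge list and a great-circle distance calculation:

```python
import math

# Hypothetical CDN edge locations: (latitude, longitude)
EDGES = {
    "frankfurt": (50.11, 8.68),
    "virginia": (38.95, -77.45),
    "singapore": (1.35, 103.99),
}

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    r = 6371.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_edge(user_lat, user_lon):
    """Route a request to the geographically closest edge location."""
    return min(EDGES, key=lambda e: haversine_km(user_lat, user_lon, *EDGES[e]))

# A viewer in Paris is served from the Frankfurt edge.
print(nearest_edge(48.86, 2.35))  # frankfurt
```

Distance is only a proxy for latency – production systems also weigh edge load and network conditions – but it captures why dispersing storage closer to users speeds up delivery.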


How to Become a Data Analyst – Complete Roadmap

First, understand that the field of data analysis is not about computer science per se, but about applying computation, analysis, and statistics. The field focuses on working with large datasets and producing useful insights that help solve real-life problems. The whole process starts with a hypothesis that needs to be answered, followed by gathering new data to test that hypothesis. There are two major categories of data analysts: tech and non-tech. They work with different tools, and professionals in the tech domain are also required to know the relevant programming languages (such as R or Python). A working professional should be fluent in statistics so that they can turn any amount of raw data into a well-structured presentation. ... Today, countless companies generate data on a daily basis and use it to make crucial business decisions, helping them set future goals and new milestones. We're living in a world where data is the new fuel, and data analysts are needed in every sector to make that data useful.
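The hypothesis-testing loop described above can be sketched in a few lines of Python. The numbers here are invented for illustration: suppose an analyst hypothesizes that a redesigned checkout page (B) converts better than the current one (A), and gathers conversion counts for each. A pooled two-proportion z-test – one standard textbook approach – then says whether the observed difference is statistically significant:

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Pooled two-proportion z-statistic for H0: rate_a == rate_b."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical A/B data: 200/1000 conversions (A) vs 260/1000 (B).
z = two_proportion_z(200, 1000, 260, 1000)
print(round(z, 2))    # 3.19
print(abs(z) > 1.96)  # True: reject H0 at the 5% level
```

This is the whole arc in miniature: a hypothesis, gathered data, a statistical test, and an insight a business can act on.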


Software developer: A day in the life

An analytics role will require you to learn new skills continuously, look at things in new ways, and embrace new perspectives. In technology and business, things happen quickly, so it is important to keep up with what is happening in the industries you are involved in. Never forget that at its core, technology is about problem-solving. Don’t get too attached to any coding language; be aware that you won’t always be able to use the language you like, do the refactor you want, or perform the update you expect. The end focus is always on the client, and their needs take priority over developer preferences. Be prepared to use English every day. To keep your skills sharp, read documentation, talk to others often, and watch videos. ... Any analytics professional who wants to elevate their career should stay attentive to new technologies and updates, become an expert in a specific language or technology, and understand low-level programming concepts across a variety of languages. Finally, if you enjoy logic, math, and problem-solving, consider a career in software development. The world needs your skills to solve big challenges.



Quote for the day:

"Leadership Principle: As hunger increases, excuses decrease." -- Orrin Woodward