
Daily Tech Digest - November 21, 2025


Quote for the day:

“You live longer once you realize that any time spent being unhappy is wasted.” -- Ruth E. Renkl



DPDP Rules and the Future of Child Data Safety

Most obligations for Data Fiduciaries, including verifiable parental consent, security safeguards, breach notifications, data minimisation, and processing restrictions for children’s data, come into force after 18 months. This means that although the law recognises children’s rights today, full legal protection will not be enforceable until the end of the 18-month window. ... Parents’ awareness of data rights, online safety, and responsible technology is the backbone of their informed participation. The government needs to undertake a nationwide Digital Parenting Awareness Campaign with the help of State Education Departments, modelled on literacy and health awareness drives. ... schools often outsource digital functions to vendors without due diligence. Over the next 18 months, they must map where student data is collected and where it flows, renegotiate contracts with vendors, ensure secure data storage, and train teachers to spot data risks. Nationwide teacher-training programmes should embed digital pedagogy, data privacy, and ethical use of technology as core competencies. ... effective implementation will be contingent on the autonomy, resourcefulness, and accessibility of the Data Protection Board. The regulator should include specialised talent such as cybersecurity specialists and privacy engineers. It should be supported by an in-house digital forensics unit capable of investigating leaks, tracing unauthorised access, and examining algorithmic profiling.


5 best practices for small and medium businesses (SMEs) to strengthen cybersecurity

First, begin with good access control, restricting employees to only the permissions they specifically require. It is also important to have multi-factor authentication in place and to regularly audit user accounts, particularly when roles shift or personnel depart. Second, keep systems and software current by immediately patching operating systems, applications, and security software to close vulnerabilities before attackers can exploit them; updates should be automated to avoid human error. Staff are usually the front line of defence, so the third essential practice is continuous training of employees in identifying phishing attempts, suspicious links, and social engineering methods, making them active guardians of corporate data and effectively cutting the risk of a data breach. Fourth is safeguarding your data: keep regular backups stored safely in multiple places and complement them with an explicit disaster recovery strategy, so that you are able to restore operations promptly, reduce downtime, and constrain losses in the event of a cyber attack. Fifth and finally, companies should embrace a layered security paradigm using antivirus tools, firewalls, endpoint protection, encryption, and safe networks. Each layer complements the others, creating a resilient defence that protects your digital ecosystem and strengthens trust with partners, customers, and stakeholders.
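The fourth practice above (backups in multiple places) is only useful if the copies are still intact. A minimal sketch of one way to check that, comparing SHA-256 checksums of each backup copy against the original; file names here are illustrative, and a real backup system would also verify restore procedures, not just hashes:

```python
import hashlib
from pathlib import Path

def sha256(path: Path) -> str:
    """Hash a file so independent backup copies can be compared."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_backups(original: Path, copies: list[Path]) -> list[Path]:
    """Return the backup copies whose contents no longer match the original."""
    expected = sha256(original)
    return [c for c in copies if not c.exists() or sha256(c) != expected]
```

A scheduled job could run this after every backup cycle and alert on any non-empty result.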


How Artificial Intelligence is Reshaping the Software Development Life Cycle (SDLC)

With AI tools, workflows become faster and more efficient, giving engineers more time to concentrate on creative innovation and tackling complex challenges. As these models advance, they can better grasp context, learn from previous projects, and adapt to evolving needs. ... AI streamlines software design by speeding up prototyping, automating routine tasks, optimizing with predictive analytics, and strengthening security. It generates design options, translates business goals into technical requirements, and uses fitness functions to keep code aligned with architecture. This allows architects to prioritize strategic innovation and boosts development quality and efficiency. ... AI is shifting developers’ roles from manual coding to strategic "code orchestration." Critical thinking, business insight, and ethical decision-making remain vital. AI can manage routine tasks, but human validation is necessary for security, quality, and goal alignment. Developers skilled in AI tools will be highly sought after. ... AI serves to augment, not replace, the contributions of human engineers by managing extensive data processing and pattern recognition tasks. The synergy between AI's computational proficiency and human analytical judgment results in outcomes that are both more precise and actionable. Engineers are thus empowered to concentrate on interpreting AI-generated insights and implementing informed decisions, as opposed to conducting manual data analysis.


Innovative Approaches To Addressing The Cybersecurity Skills Gap

In a talent-constrained world, forward-leaning organizations aren’t hiring more analysts—they’re deploying agentic AI to generate continuous, cryptographic proof that controls worked when it mattered. This defensible automation reduces breach impact, insurer friction and boardroom risk—no headcount required. ... Create an architecture and engineering review board (AERB) that all current and future technical designs are required to flow through. Make sure the AERB comprises a small group of your best engineers, developers, network engineers and security experts. The group should meet multiple times a year, and all technical staff should be required to rotate through to listen and contribute to the AERB. ... Build security into product design instead of adding it in afterward. Embed industry best practices through predefined controls and policy templates that enforce protection automatically—then partner with trusted experts who can extend that foundation with deep, domain-specific insight. Together, these strategies turn scarce talent into amplified capability. ... Rather than chasing scarce talent, companies should focus on visibility and context. Most breaches stem from unknown identities and unchecked access, not zero days. By strengthening identity governance and access intelligence, organizations can multiply the impact of small security teams, turning knowledge, not headcount, into their greatest defense.


The Configurable Bank: Low‑Code, AI, and Personalization at Scale

What does the modern banking system look like? The answer depends on where you stand. For customers, digital banking solutions need to be instant, invisible, and intuitive – a seamless tap, a scan, a click. For banks, it’s an ever-evolving race to keep pace with rising expectations. ... What was once a luxury – speed and dependability – has become the standard. Yet, behind the sleek mobile apps and fast payments, many banks are still anchored to quarterly release cycles and manual processes that slow innovation. To thrive in this landscape, banks don’t need to rip out their core systems. What they need is configurability – the ability to re-engineer services to be more agile, composable, and responsive. By making their systems configurable rather than fixed, banks can launch products faster, adapt policies in real time, and reduce the cost and complexity of change. ... The idea of the Configurable Bank is built on this shift – where technology, powered by low-code and AI, transforms banking into a living, adaptive platform. One that learns, evolves, and personalizes at scale – not by replacing the core, but by reimagining how it connects with everything around it. ... This is not just a technology shift; it’s a strategic one. With low-code, innovation is no longer the privilege of IT alone. Business teams, product leaders, and even customer-facing units can now shape and deploy digital experiences in near real time.


Deepfake crisis gets dire prompting new investment, calls for regulation

Kevin Tian, Doppel’s CEO, says that organizations are not prepared for the flood of AI-generated deception coming at them. “Over the past few months, what’s gotten significantly better is the ability to do real-time, synchronous deepfake conversations in an intelligent manner. I can chat with my own deepfake in real-time. It’s not scripted, it’s dynamic.” Tian tells Fortune that Doppel’s mission is not to stamp out deepfakes, but “to stop social engineering attacks, and the malicious use of deepfakes, traditional impersonations, copycatting, fraud, phishing – you name it.” The firm says its R&D team has “just scratched the surface” of innovations it plans to bring to existing and upcoming products, notably in social engineering defense (SED). The Series C funds will “be used to invest in the core Doppel gang to meet the exponential surge in demand.” ... Advocating for “laws that prioritize human dignity and protect democracy,” the piece points to the EU’s AI Act and Digital Services Act as models, and specifically to new copyright legislation in Denmark, which bans the creation of deepfakes without a subject’s consent. In the authors’ words, Denmark’s law would “legally enshrine the principle that you own you.” ... “The rise of deepfake technology has shown that voluntary policies have failed; companies will not police themselves until it becomes too expensive not to do so,” says the piece.


The what, why and how of agentic AI for supply chain management

To be sure, software and automation are nothing new in the supply chain space. Businesses have long used digital tools to help track inventories, manage fleet schedules and so on as a way of boosting efficiency and scalability. Agentic AI, however, goes further than traditional SCM software tools, offering capabilities that conventional systems lack. For instance, because agents are guided by AI models, they are capable of identifying novel solutions to challenges they encounter. Traditional SCM tools can’t do this because they rely on pre-scripted options and don’t know what to do when they encounter a scenario no one envisioned beforehand. AI can also automate multiple, interdependent SCM processes, as I mentioned above. Traditional SCM tools don’t usually do this; they tend to focus on singular tasks that, although they may involve multiple steps, are challenging to automate fully because conventional tools can’t reason their way through unforeseen variables in the way AI agents do. ... Deploying agents directly into production is enormously risky because it can be challenging to predict what they’ll do. Instead, begin with a proof of concept and use it to validate agent features and reliability. Don’t let agents touch production systems until you’re deeply confident in their abilities. ... For high-stakes or particularly complex workflows, it’s often wise to keep a human in the loop.


How AI can magnify your tech debt - and 4 ways to avoid that trap

The survey, conducted in September, involved 123 executives and managers from large companies. There are high hopes that AI will help cut into and clear up technical-debt issues, and reduce costs. At least 80% expect productivity gains, and 55% anticipate AI will help reduce technical debt. However, the large segment expecting AI to increase technical debt reflects "real anxiety about security, legacy integration, and black-box behavior as AI scales across the stack," the researchers indicated. Top concerns include security vulnerabilities (59%), legacy integration complexity (50%), and loss of visibility (42%). ... "Technical debt exists at many different levels of the technology stack," Gary Hoberman, CEO of Unqork, told ZDNET. "You can have the best 10X engineer or the best AI model writing the most beautiful, efficient code ever seen, but that code could still be running on runtimes that are themselves filled with technical debt and security issues. Or they may also be relying on open-source libraries that are no longer supported." ... AI presents a new raft of problems to the tech debt challenge. The rising use of AI-assisted code risks "unintended consequences, such as runaway maintenance costs and increasing tech debt," Hoberman continued. IT is already overwhelmed with current system maintenance.


The State and Current Viability of Real-Time Analytics

Data managers now prefer real-time analytical capabilities built within their applications and systems, rather than a separate, standalone, or bolted-on project. Interest in real-time analytics as a standalone effort has dropped from 50% to 32% during the past 2 years, a recent survey of 259 data managers conducted by Unisphere Research finds. ... So, the question becomes: Are real-time analytics ubiquitous to the point at which they are automatically integrated into any and all applications? By now, the use of real-time analytics should be a “standard operating requirement” for customer experience, said Srini Srinivasan, founder and CTO at Aerospike. This is where the rubber meets the road—where “the majority of the advances in real-time applications have been made in consumer-oriented enterprises,” he added. Along these lines, the most prominent use cases for real-time analytics include “risk analysis, fraud detection, recommendation engines, user-based dynamic pricing, dynamic billing and charging, and customer 360,” Srinivasan continued. “For over a decade, these systems have been using AI and machine learning [ML] inferencing for improving the quality of real-time decisions to improve customer experience at scale. The goal is to ensure that the first customer and the hundred-millionth customer have the same vitality of customer experience.” ... “Within industries such as energy, life sciences, and chemicals, the next decade of real-time analytics will be driven by more autonomous operations,” said David Streit.


You Down with EDD? Making Sense of LLMs Through Evaluations

We're facing a major infrastructure maturity gap in AI development — the same gap the software world faced decades ago when applications grew too complex for informal testing and crossed fingers. Shipping fast with user feedback works early on, but when done at scale with rising stakes, "vibes" break down and developers demand structure, predictability, and confidence in their deployments. ... AI engineering teams are turning to an emerging solution: evaluation-driven development (EDD), the probabilistic cousin to TDD. An evaluation looks similar to a traditional software test. You have an assertion, a response, and pass-fail criteria, but instead of asking "Does this function return 42?" you're asking "Does this legal AI application correctly flag the three highest-risk clauses in this nightmare of a merger agreement?" Our trust in AI systems comes from our trust in the evaluations themselves, and if you never see an evaluation fail, you're not testing the right behaviors. The practice of Evaluation-Driven Development (EDD) is about repeatedly testing these evaluations. ... The technology for EDD is ready. Modern AI platforms provide solid evaluation frameworks that integrate with existing development workflows, but the challenge facing wide adoption is cultural. Teams need to embrace the discipline of writing evaluations before changing systems, just like they learned to write tests before shipping code. It requires a mindset shift from "move fast and break things," to "move deliberately and measure everything."
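The "assertion, response, pass-fail criteria" shape described above can be sketched in a few lines. This is a minimal illustration, not a real framework: `toy_model`, the evaluation names, and the clause numbers are all invented for the example, and a production evaluation would call an actual LLM and use richer grading (rubrics, LLM-as-judge, statistical thresholds) rather than substring checks:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Evaluation:
    """One evaluation: a prompt plus pass/fail criteria over the response."""
    name: str
    prompt: str
    passes: Callable[[str], bool]  # grading function, not an exact-match assert

def run_evals(model: Callable[[str], str], evals: list["Evaluation"]) -> dict[str, bool]:
    """Run every evaluation against the model and report pass/fail per case."""
    return {e.name: e.passes(model(e.prompt)) for e in evals}

# A stand-in "model" for illustration; a real one would call an LLM API.
def toy_model(prompt: str) -> str:
    return "Clauses 4, 7, and 12 carry the highest risk."

evals = [
    Evaluation(
        name="flags_high_risk_clauses",
        prompt="List the three highest-risk clauses in the merger agreement.",
        # Hypothetical criterion: the response must mention all three expected clauses.
        passes=lambda r: all(c in r for c in ("4", "7", "12")),
    ),
]
```

The EDD discipline is then to run suites like this before and after every prompt or model change, and to treat a suite that never fails as a sign the wrong behaviors are being tested.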

Daily Tech Digest - January 30, 2019

Cisco serves up flexible data-center options
Cisco has now extended ACI with ACI Anywhere to the cloud – specifically Amazon AWS and Microsoft Azure environments. The idea is that customers will have the flexibility to run and control applications anywhere they want across private or public clouds or at the edge and while maintaining consistent network policies across their entire domain. “There is nothing centered about data centers anymore,” said Roland Acra, senior vice president and general manager for Cisco’s Data Center Networking business. “IT teams have been forced to make a hard choice: stay with their on-premises data centers with a rich set of tools of their choice for automation or assurance or security; or move to the cloud, where a different set of capabilities can make consistent compliance a true challenge. ACI Anywhere removes that challenge and places workloads where it makes the most sense regardless of the platform or hypervisor.” ACI Anywhere would, for example, let policies configured through Cisco’s SDN APIC use native APIs offered by a public-cloud provider to orchestrate changes within both the private and public cloud environments, Cisco said.


Unconfigured IoT is a security risk, warns researcher

Many IoT devices work initially in an access point mode, so users can connect to the device using a smartphone to reconfigure it to become a client on the wireless network by entering the network security key, thereby making it much more secure. But businesses and consumers will often elect not to connect appliances to the internet, believing this is safer. ...  “This means that if the device remains unconfigured, it will remain in the default state, making it even more vulnerable than if it were connected to the internet and configured,” said Munro. “Although this opens up another set of vulnerabilities, organisations and consumers are becoming increasingly aware of these vulnerabilities and are therefore more likely to be aware of the risks and how to mitigate them.” But with an unconfigured device, attackers could use a war driving or access mapping attack, which would make it easy to compromise these devices, said Munro, because the attacker could identify a target wireless network using a geolocation site, such as wigle.net, that shows wireless access points in any given location and enables account holders to search its database for unconfigured IoT devices.


Serverless computing’s dark side: less portability for your apps

How a serverless development platform calls into your serverless code can vary, and there is no uniformity between public clouds. Most developers who build applications on serverless cloud-based systems couple their code tightly to a public cloud provider’s native APIs. That can make it hard, or unviable, to move the code to another platform. The long and short of this is that if you build an application on a cloud-native serverless system, it’s difficult to move either to another cloud provider or back on-premises. I don’t mean to pick on serverless systems; they are very handy. However, more and more I’m seeing that enterprises that demand portability when picking cloud providers and application development and deployment platforms often opt for what’s fastest, cheapest, and easiest. Portability be damned. Of course, containers are also growing by leaps and bounds, and one of the advantages of containers is portability. However, they take extra work, and they need to be built with a container architecture in mind to be effective.
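One common mitigation for the coupling described above is to keep business logic provider-agnostic and confine each cloud's handler signature to a thin adapter. A hedged sketch, where `apply_discount` and the event field names are invented for illustration (the AWS Lambda event/response shapes shown follow the API Gateway proxy convention, but check your provider's documentation):

```python
import json

def apply_discount(order: dict) -> dict:
    """Provider-agnostic business logic: no cloud SDK imports in here."""
    order = dict(order)
    order["total"] = round(order["total"] * 0.9, 2)
    return order

# Thin provider-specific adapters own the event shapes; swapping clouds means
# rewriting only these few lines, not the core logic.
def aws_lambda_handler(event, context):
    body = json.loads(event["body"])
    return {"statusCode": 200, "body": json.dumps(apply_discount(body))}

def azure_function_handler(req_body: dict) -> dict:
    return apply_discount(req_body)
```

The core function can also be unit-tested locally with no cloud emulator, which is a second payoff of the separation.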


Success or Burnout? Q&A on How Personal Agility Can Help

Personal Agility is a simple coaching framework; it is based on just six powerful questions, a weekly event for asking the questions, and an “information radiator” to help you understand and act upon the answers. You can do it yourself without needing agreement or permission from anyone else! The key question “What really matters?” provides guidance for deciding how to spend your time. The next question, “What did you accomplish last week?” helps you understand where you are and to feel good about yourself and what you’ve done! The next questions help you to figure out what is (or is not) important to do this week. “What could you do?” looks at possibilities; “Of those things, which are important or urgent?” helps you to identify the essentials; and “Which ones do you want to get done this week?” helps you set a course with realistic objectives, so you can make steady progress to achieve bigger goals. Finally, “Who can help?” is a classic coaching question that helps you get unstuck.


IT leaders must address integration to support business ecosystem


The survey found that almost half (48%) of organisations want to modernise their IT in order to compete more effectively in today’s digital business landscape. Respondents said modernisation is key to consolidating disparate technologies, automating data transaction processes and gaining visibility into their critical data flows. However, the research found that modernisation is one of the enterprise’s biggest challenges. According to Cleo, while the surveyed IT decision-makers understand the limitations and high maintenance cost of legacy technologies, they also recognise the systems’ importance to day-to-day operations. In Cleo’s experience, a major part of digital transformation is balancing old and new technologies, which means integrating legacy systems with modern applications cost-effectively and without disruption. For this reason, enterprises must simultaneously maintain legacy systems while adopting newer cloud services and software-as-a-service (SaaS) solutions to engage in and support how business is done today, it said.


How to Estimate Software Projects in A Test-Driven Development Environment

A good project manager intentionally limits the amount of information available to participants for discussion. The less information is provided, the lower the chance of an error. If we look back at the above description, what’s in it for us? First, it helps us define the user. In our case, it’s a registered user who has previously placed an order on the website. Second, the required functionality should have time and data limitations. Third, and very importantly, the action that the user performs is atomic. Sequences or non-linear sequences of actions in the description of the functionality are a road straight to hell. And for all the participants involved, not just the customer! Subjectively speaking, the ideal user stories imply that the user needs a minute or less to become aware of how to perform this or that action. In this case, by “aware” we mean that a user has already performed the same or very similar action in a different application.
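In a TDD environment, a well-bounded story like the one above translates directly into a test written before the implementation. A minimal sketch, where `User`, `Order`, and `reorder` are hypothetical names invented to illustrate the story's atomic action ("a registered user repeats a previous order"), not code from the article:

```python
class Order:
    def __init__(self, items):
        self.items = list(items)

class User:
    def __init__(self, registered: bool, orders=None):
        self.registered = registered
        self.orders = list(orders or [])

def reorder(user: User) -> Order:
    """The story's single atomic action: repeat the user's most recent order."""
    if not user.registered or not user.orders:
        # Enforce the story's precondition: a registered user with a past order.
        raise ValueError("requires a registered user with at least one past order")
    return Order(user.orders[-1].items)

# Written first, TDD-style: the test pins down the story before any implementation.
def test_registered_user_can_repeat_last_order():
    user = User(registered=True, orders=[Order(["book"])])
    assert reorder(user).items == ["book"]
```

Because the action is atomic, the test stays short and the estimate for it stays honest; a story that needed a sequence of such actions would need a chain of tests, which is exactly the estimation trap the text warns about.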


Japan's IoT Security Strategy: Break Into Devices

Identifying potentially vulnerable IoT devices that face the internet can be accomplished using search engines such as Shodan, which allow for search queries based on certain parameters. Once a device has been found, taking it to the next level - attempting to log into the device - is generally a criminal offense in most countries. That presumably is the case in Japan as well and the reason why the law had to be modified to make it legal for the survey (see: Could a Defensive Hack Fix the Internet of Things?). With the law changed and permission to proceed, it should be easier to identify vulnerable devices. The larger problem is trying to resolve the vulnerabilities. Fixing vulnerabilities that lead to large botnets has been vexing. A decade ago, attackers commandeered large networks of desktop computers via browser and operating system vulnerabilities. Law enforcement agencies and private companies found success in shutting down the command-and-control servers for those botnets. But it left the problem of cleaning up infected devices, which usually involved the owners of those devices installing security patches.


CEOs and software

Neither software leaders nor CIOs can catapult their software organizations into the digital era without the right CEO support. CEO actions, or lack thereof, can stymie progress toward the software capability that digital business demands. Why? Software success depends on factors that only CEOs control. CEO control starts with funding for software initiatives — buy, build, and everything in between, plus modernization of outdated software. We track software leaders’ views on the top 10 barriers to improved software delivery (see Figure 1), with the barriers owned by CEOs highlighted in red. ... Software Delivery Speed Is Stuck: “Things are moving so fast in our market,” said the CEO of a professional services firm. “I live in terror of being left behind.” Speed of software delivery is a leading indicator of health and vitality in a software-delivery organization and a signal that a software team’s digital transformation is underway. During the past five years, developers have made almost no progress in their ability to deliver software quickly.


How traffic scrubbing can guard against DDoS attacks


A growing number of enterprises are investing in DDoS solutions, especially cloud-based DDoS mitigation services, with a shift away from a service-provider-centric market. A DDoS attack is one of the most complex threats that businesses can face. The goal of the individual hacker, organised criminals or state actors is to overwhelm a company’s network, website or network component, such as a router. To begin with, organisations have to determine whether a spike in traffic is legitimate or is an attack. “Without a solid understanding of baselines and historic traffic trends, organisations are unlikely to detect an attack until it is too late,” said Sherrel Roche, senior market analyst at IDC’s Asia-Pacific business and IT services research group. Landbank, the largest government-owned bank in the Philippines, has taken the step of implementing F5’s BIG-IP local traffic manager to understand its application traffic and performance better, as well as to gain full visibility into customer data as it enters and leaves an application. This enables the security team to inspect, manage and report fraudulent transactions as soon as they are spotted.
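The baseline idea Roche describes can be sketched very simply: flag the current traffic level when it sits far above the historical mean. This is a toy illustration of the principle only; real DDoS detection also weighs time-of-day seasonality, protocol mix, and source-IP diversity, none of which appear here:

```python
from statistics import mean, stdev

def is_traffic_spike(history: list[float], current: float, sigmas: float = 3.0) -> bool:
    """Flag traffic far above the historical baseline (mean + N standard deviations)."""
    if len(history) < 2:
        return False  # not enough data to establish a baseline yet
    baseline, spread = mean(history), stdev(history)
    # Guard against a zero spread when history is perfectly flat.
    return current > baseline + sigmas * max(spread, 1e-9)
```

Without the `history` window, every legitimate flash-crowd event would look identical to an attack, which is exactly the "detect too late" failure mode the quote warns about.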


DevOps Adoption Practices

Many organizations start with an environment that is full of variables: different processes, different environments, different tools, and several permutations of configurations and data. All this makes automation hard and reduces your ability to learn as each variable could be the cause of the problem. The first step is to look at all those variables and see what you can remove. Can you align the patch levels across environments? Can you deploy the same version of the application across environments? Some variables can only be removed later on, but understanding what all the variable pieces are and doing a clean-up first will make later efforts easier. ... Someone once told me: "You cannot automate what you cannot document." After all, automation is a form of documentation of a process. What is even more important is that automating a bad process just creates more problems. I also think that writing down a solution forces you to think it through in a way that verbal communication or just starting to write code does not.
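The "look at all those variables and see what you can remove" step can be made concrete by diffing environment configurations. A minimal sketch, with the environment names and settings invented for illustration; in practice the inputs would come from configuration management or infrastructure-as-code state rather than hand-written dicts:

```python
def config_drift(envs: dict[str, dict[str, str]]) -> dict[str, dict[str, str]]:
    """Return every setting whose value differs across environments:
    each one is a variable worth aligning (or removing) before automating."""
    all_keys = set().union(*(cfg.keys() for cfg in envs.values()))
    drift = {}
    for key in sorted(all_keys):
        values = {env: cfg.get(key, "<missing>") for env, cfg in envs.items()}
        if len(set(values.values())) > 1:
            drift[key] = values
    return drift
```

Running this across dev, test, and production turns the clean-up from a vague goal into a checklist, and re-running it keeps later automation honest.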



Quote for the day:


"A leadership disposition guides you to take the path of most resistance and turn it into the path of least resistance." -- Dov Seidman


Daily Tech Digest - May 02, 2018

Next Port of Call — Digitization of Automotive Retail

As per Cox Automotive's survey, for every retail sale, customers visit the auto dealer only two to three times at most, including to sign the contract and to take custody of the vehicle. However, consumers are also taking the unbeaten path, initiating the buying process online by “building a vehicle” to their specifications and then searching inventory in a specific geography. The buyer evaluates their current vehicle’s trade-in value based on its model, option content, age, and condition. The financial institution (either a traditional bank or a newer online lender) reviews, selects and approves financing and the consumer’s choice of purchase or lease in real time. Then the purchase process shifts from digital to more traditional retail when the consumer arrives at the dealership to test drive the vehicle and sign the necessary paperwork to take ownership. Some dealers, taking advantage of their close proximity to the customer, further emulate the new online purchasing model by delivering the vehicle directly to the customer’s home at no charge.


Resolving who actually owns security in agile development

As we know, the developers’ main focus is getting a working product out the door as fast as possible, while the security folks want to reduce the chances that the product will contain vulnerabilities. Ideally, the developers would be able to code without any interruption or interference from the security folks. However, since developers are only human, there will always be flaws in the code that they write themselves, as well as issues in the code that they take from third parties, such as open source repositories on GitHub. We know that it is cheaper in terms of time and money to catch and fix vulnerabilities early in the process rather than later, especially when your developers have built more features on top of imperfect code. Moreover, we see a bottleneck occurring when security issues are left unaddressed until a short while before release (when stress levels are particularly high).


Shifting a Corporate Culture at Scale — and with Speed


Speed was very important in decision making. The culture of the prior organization was to extensively “discuss and deliberate.” As an example, the first meeting I was at had 25 people. My first call had 100 people. People were coming into meetings who were not necessarily contributing but who were transcribing and communicating to others; they weren’t the people who were supposed to take the action. One of the first meetings I had on July 14 was a review of the business. I had a stack of paper on one side, a stack of paper on the other side. I said, “I’m going to make a policy decision: no more paper.” And of course, I got a call that evening, saying, “Hey, I don’t know if you are aware of the fact you work for Xerox.” And I said, “Oops.” I said no more paper because the idea is to quickly convert people from the previous approach. People are showing up, and they’re basically reading off the presentation. So we changed that. But organizational structure is the clearest way to inform you as to how successful you will be.


State of Cybersecurity 2018: Enterprises Can Do Better

It seems that over the past 12 months, security has slipped down the boardroom agenda. According to the survey results, only 20% of organizations have their security function reporting to the CEO or main board. This represents an even lower figure than the 24% from last year (although the question in the previous year was phrased slightly differently). Also, 57% of the practitioners surveyed believed that their main board was adequately supporting security initiatives, a 10-percentage-point decrease from the 67% figure of the previous year. On the bright side, 64% of enterprises were expecting to increase their cybersecurity budget this year, which also means that in 36% of enterprises, the expectation is to make do with the same or less money for their security efforts. That is an improvement over last year (when only 50% of respondents expected a security budget increase) but still shows a degree of complacency or risk-optimism in a sizable number of organizations.


Rip and replace your RDBMS? No – build cloud apps instead.

“Customer success” is not just a nice way to give a new name to services. It is very much a mindset and a model that says you have to really understand what your customer is trying to achieve. That advice also means architecture planning, tying DataStax into an array of tools and vendors, from the storage layer to the security layer to the middleware layer: “How we interact and engage with our partners is all very important.” So is this a revenue play for DataStax, or is it about solidifying the customer relationship and making sure the projects deliver? Bosworth says it’s very much the latter. Without opening the entire financial kimono, he offered this: "We don’t share a lot of financial information. One thing I can tell you is our gross margins run north of 75 percent – that’s our blended gross margin as a company. That’s really how you can figure out if a company is a services company or a software company. Certainly anything upwards of 70 percent puts you in the software category. … kind of time-to-impact if you will."


University of San Francisco GE Digital Transformation Case Study

Improving the productivity of existing assets by even a single percentage point can generate significant benefits in the oil and gas sector, and in other sectors. “The average recovery rate of an oil well is 35%, meaning 65% of a well’s potential draw is left in the earth because available technology makes it too expensive,” explains Haynes-Gaspar. “If we can help raise that 35% to 36%, the world’s output will increase by 80 billion barrels — the equivalent of three years of global supply. The economic implications are huge.” GE bet big on the Industrial Internet, putting sensors on all of its products, including gas turbines, jet engines, and other machines; connecting them to the cloud; and analyzing the resulting flow of data. The goal: identify ways to improve machine productivity and reliability. And it didn’t take long for GE engineers to realize that they could find interesting and unique patterns in the data.


Car hackers find remotely exploitable vulnerabilities in Volkswagen and Audi vehicles

The researchers noted, “Based on our experience, it seems that cars which have been produced before are not automatically updated when being serviced at a dealer, thus are still vulnerable to the described attack.” I encourage you to read their research paper, which delves into their attack strategy and technical system details, but it does not fully disclose the details of the remotely exploitable vulnerability because that, they believe, would be “irresponsible.” The researchers said they want to protect future cars but ask, “What about the cars of today or cars that were shipped last week? They often don’t have the required capabilities (such as over-the-air updates) but will be on our roads for the next fifteen years. We believe they currently pose the real threat to their owners, having drive-by-wire technology in cars that are internet-connected without any way to reliably update the entire fleet at once.” The hacked car models were from 2015, so if you have an Audi or Volkswagen, contact your dealer and ask about a software update.


Collaboration with utilities seen as first step in growth of smart cities

Berst said cities can invest in becoming a smart city in small ways. From installing smart street lights to putting in solar rooftops and other distributed renewable energy sources, to providing residents with electric car charging stations, cities can not only provide a more environmentally friendly atmosphere, but also save money. Installing smart street lights, such as through the Urbanova initiative for example, can save a city millions in electricity costs. “Smart street lights have a pay-off of three years or less. It’s one of the lesser expensive on-ramps that can lead to a deeper collaboration,” Berst said. “While those trucks are there installing the LED street lights, why not have them snap in a communications network into that existing infrastructure while they are up there? Now, not only do you have smart street lights, but an entire communications network as well.”


A Quick Guide to Implementing ATDD


Collaboration is one of the core values of the Agile methodology. Once, while working on a large project, I noticed a lack of collaboration between developers, testers, and business-minded individuals; a lack of clarity in requirements; frequent requirements scope creep; a lack of visibility with regard to the testing completed; and defects being identified late in the project lifecycle. Most important to me, though, was that no one had any idea about our automation framework, so all of the automation tests were written after the features were developed and ready for testing. ... As a result, I found Acceptance Test Driven Development (ATDD) to be one of the approaches used to mitigate many of these issues. It is often used synonymously with Behavior Driven Development (BDD), Story Test Driven Development (SDD) and Specification By Example (SBE). The main distinction of ATDD, as opposed to other agile approaches, is its focus on making developers, testers, business people, product owners and other stakeholders collaborate as one unit and create a clear understanding of what needs to be implemented.
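In ATDD, the team's shared acceptance criterion is captured as an executable test before the feature is built. Below is a minimal sketch using Python's built-in unittest; the Cart class and its behavior are hypothetical stand-ins for whatever feature the team has agreed on, not something from the article.

```python
import unittest


class Cart:
    """Hypothetical domain object, included only to make the example runnable."""

    def __init__(self):
        self._items = []

    def add(self, name, price):
        self._items.append((name, price))

    @property
    def total(self):
        return sum(price for _, price in self._items)


class TestCartAcceptance(unittest.TestCase):
    """Acceptance criterion agreed by developers, testers and business people:
    'The cart total is the sum of the prices of the items in it.'"""

    def test_total_reflects_added_items(self):
        # Given an empty cart
        cart = Cart()
        # When the customer adds two items
        cart.add("book", 12.50)
        cart.add("pen", 1.50)
        # Then the total is the sum of their prices
        self.assertEqual(cart.total, 14.00)

# Run with: python -m unittest this_file.py
```

Because the test is written in the Given/When/Then shape the whole team discussed, it doubles as a requirements document: when it goes green, the agreed behavior exists.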


At Interop: Everyone Into the AI Pool

"Now is the time to proactively look for problems where you can apply this. Yes, I think it's that important," he said, adding that you could toss a dart at a company org chart and find an area that could benefit from AI. Helping to identify the problems to be solved, and the type of improvement -- be it a new product or service, or a process improvement -- that should result is where business leaders need to work with technologists and data scientists to match the goals with technology capabilities. Putting AI and machine learning into action is where David Karandish, founder and CEO of Ai Software, took over. There's been plenty of discussion about how to use intelligent assistants or agents in the corporate world, taking a step beyond the bots that have popped up on websites in recent years. Karandish introduced the audience to his company's "Jane", a chat-based assistant that answers questions for employees and customers when integrated with a client company's internal systems. It's in use at several client companies besides his own.



Quote for the day:


"Knowledge is like underwear. It is useful to have it, but not necessary to show it off." -- Bill Murray


Daily Tech Digest - April 19, 2018

5G Security Challenges and Ways to Overcome Them


5G is on its way to serve vertical industries, not just individual customers, who are mostly concerned with a faster mobile network or richer smartphone functionality. When it comes to serving vertical industries, security requirements may vary from one service to another. As the Internet of Things (IoT) continues to gain momentum, more people will be able to remotely operate networked devices, which will surely call for the deployment of stricter user-authentication methods to prevent unauthorized access to IoT devices. For example, biometric identification systems can be installed in smart homes. ... 5G networks are expected to be enhanced by the deployment of new cost-effective IT technologies such as virtualization and Software Defined Networking (SDN)/Network Functions Virtualization (NFV). However, 5G services can be equipped with appropriate security mechanisms only if the network infrastructure is robust enough to support the security features. The security of network function elements in legacy networks depends, to a large extent, on how well their physical entities can be separated from each other.


Broadband Forum's User Services Platform (USP) specification
As IoT devices grow in popularity, it creates a greater security vulnerability for consumers. Service providers and consumer electronics manufacturers can now leverage the USP standard to perform lifecycle management of connected devices and carry out upgrades to address critical security updates. Newly installed or purchased devices and virtual services can also be easily added, while customer support is improved by remote monitoring and troubleshooting of connected devices, services and home network links. Additionally, the specification enables secure control of IoT, smart home and smart networking functions and helps map the home network to manage service quality and monitor threats. Work on the USP specification was carried out by the Broadband User Services (BUS) Work Area, which is led by Co-Directors John Blackford of Arris, who is also a Broadband Forum board member, and Jason Walls of QA Cafe. AT&T, Axiros, Google, Greenwave Systems, Huawei, NEC, Nokia, and Orange also participated in developing USP.



Notes from the AI frontier: Applications and value of deep learning

Neural networks are a subset of machine learning techniques. Essentially, they are AI systems based on simulating connected “neural units,” loosely modeling the way that neurons interact in the brain. Computational models inspired by neural connections have been studied since the 1940s and have returned to prominence as computer processing power has increased and large training data sets have been used to successfully analyze input data such as images, video, and speech. AI practitioners refer to these techniques as “deep learning,” since neural networks have many (“deep”) layers of simulated interconnected neurons. ... Deep learning’s capacity to analyze very large amounts of high dimensional data can take existing preventive maintenance systems to a new level. Layering in additional data, such as audio and image data, from other sensors—including relatively cheap ones such as microphones and cameras—neural networks can enhance and possibly replace more traditional methods. AI’s ability to predict failures and allow planned interventions can be used to reduce downtime and operating costs while improving production yield.
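The "many layers of simulated interconnected neurons" idea can be sketched in a few lines of plain Python. The weights and inputs below are made-up placeholders; real networks learn their weights from training data rather than hard-coding them.

```python
import math


def neuron(inputs, weights, bias):
    # One simulated "neural unit": a weighted sum of its inputs,
    # passed through a nonlinearity (here, a sigmoid).
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))


def layer(inputs, weight_rows, biases):
    # One layer: several units reading the same inputs in parallel.
    return [neuron(inputs, w, b) for w, b in zip(weight_rows, biases)]


# Two stacked layers form a (very shallow) "deep" network; production
# systems stack many such layers over images, audio, or sensor streams.
hidden = layer([0.5, -1.2], [[0.4, 0.6], [-0.3, 0.8]], [0.1, -0.1])
output = layer(hidden, [[1.0, -1.0]], [0.0])
```

Each sigmoid output lands in (0, 1), so the final value can be read as a score; training consists of nudging the weights so that score matches labeled examples.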


From BDD to TDD, the pros and cons of various agile techniques

Distributed agile makes it possible to escape any constraints of space or skills and experience in your immediate location. Modern collaboration tools like Slack, Skype, Teams, and Hangouts have made this possible. You can actually work together on stories without being in the same place and ask questions without disturbing your coworkers’ flow. Trust, rapport and communication are still essential. That’s why distributed agile works best when you have at least two teammates in any given location, they meet face to face periodically, and understand each other’s language and culture well. It’s helpful to have the whole team within a short flight and similar time zones so you can easily collaborate physically as well as virtually when needed. That team solidarity makes all the difference when you’re trying to crack a tough problem, get business or user feedback, or just onboard new team members. Agile works best when there is fast, frequent communication through standups and other formal and informal collaboration.


The evolution of forensic investigations


Protecting data, intellectual property (IP), and finances has become an increasing priority at the board room level as fraudsters proliferate and constantly adapt to more sophisticated controls and monitoring. While most organizations are susceptible to seemingly boundless criminal ingenuity, those lacking antifraud controls are predictably worse off, suffering twice the median fraud losses of those with controls in place. However, even organizations with antifraud controls can have their investigative efforts impeded by several factors. Reliance on rules-based testing is a primary culprit. Rules-based tests typically assess and monitor fraud risks across a single data set, giving only a yes or no answer. Information silos further impede analytics-aided investigative efforts. Organizations often struggle to balance the need for locally-tailored processes with the potential benefits of integrated data sharing, unintentionally creating barriers to investigative exploration as a result. The vast and growing volumes of unstructured data amassing in organizations, such as videos, images, emails, and text files, compound the challenge.


City & Guilds Group deploys SD-WAN to improve Office 365 performance

It’s a different story, though, for workers located remotely, such as in the Asia-Pacific region. For those individuals, the experience can be very frustrating. I have first-hand experience with this. Prior to being an analyst, I spent some time as a consultant, and I remember trying to open PowerPoint and Word documents out of region; it would often take minutes. Sometimes the process would go “not responding,” forcing me to shut down the application and start over. The most frustrating part was that there was no way of telling whether the file was still being downloaded or the process had died. I would often “open” the files, go do something else for a while, and come back hoping they had finished opening. Bandwidth speeds have increased, but so have the sizes of Office documents. This is the situation that remote City & Guilds workers were facing. For example, users in Wellington, New Zealand, saw extremely slow response times when accessing files from the corporate SharePoint drive, leading to a number of user complaints and a loss of productivity.


Google Cloud speech-to-text service gets revamp


In the future, enterprises will be able to feed automatically generated transcripts of business conversations into virtual assistants like IBM Watson or Google Assistant, helping those machines learn how to assist workers or customers better. "If you have your VP of marketing provide an overview of what a particular product does, that video is captured, that audio is converted into text, that text becomes searchable, and, ultimately, that text can be fed into machine intelligence systems," Vonder Haar said. Vendors are continually improving their speech-to-text tools, but enterprises shouldn't wait until those platforms are perfect before experimenting with them, said Jon Arnold, principal of Toronto-based research and analysis firm J Arnold & Associates. "To me, the big takeaway is these platforms definitely provide a lot of exciting possibilities," Arnold said. "Do some harmless in-house trials, get a feel for it, because the use cases will come out of the woodwork once you start getting comfortable with it."


15 Ways To Build Security Into Your Development Process


Knowing where to focus your likely very limited resources is key, and can be tackled by performing application risk assessments and threat modeling. By better understanding where your product or service may have unacceptable risk exposure, you can focus your time and resources appropriately. - Vijay Bolina, Blackhawk Network ... As with any collaborative endeavor that brings together people from different backgrounds, experiences and outlooks, it’s important to acknowledge the possibility of conflict up front and deal with it head-on. Senior leaders should be involved to explain why the DevSecOps ethos is so vital to the company’s future, and hold everyone accountable for advancing its success. - Todd DeLaughter, Automic Software, owned by CA Technologies (NASDAQ: CA) ... One of the most effective ways to embed security into software is to initiate security checks on boot-up. When a user restarts their device or software, the manufacturer should run a series of boot tests to detect any changes in the software and verify that it is entirely authentic.


Beyond Java: Programming languages on the JVM

If there is any language that is a known and proven quantity for developers, it’s Java. Enterprise developers, web developers, mobile developers, and plenty of others besides, have made Java ubiquitous and contributed to the massive culture of support around Java. What’s more, the Java runtime, or Java Virtual Machine (JVM), has become a software ecosystem all its own. In addition to Java, a great many other languages have leveraged the Java Virtual Machine to become powerful and valuable software development tools in their own right. Using the JVM as a runtime brings with it several benefits. The JVM has been refined over multiple decades, and can yield high performance when used well. Applications written in different languages on the JVM can share libraries and operate on the same data structures, while programmers take advantage of different language features. Below we profile several of the most significant programming languages created for the JVM. 


Microservices Communication and Governance Using Service Mesh


A service mesh is an infrastructure layer for service-to-service communication. It ensures reliable delivery of your messages across the entire system and is separate from the business logic of your services. Service meshes are often referred to as sidecars or proxies. As software fragments into microservices, service meshes go from being nice-to-have to essential. With a service mesh, not only will you ensure resilient network communications, you can also instrument for observability and control, without changing the application run-time. ... In the direct interpretation it could be used to describe both the network of microservices that make up distributed applications and the interactions between them. However, recently the term has been mostly applied to a dedicated infrastructure layer for handling service-to-service communication, usually implemented as lightweight network proxies (sidecars) that are deployed alongside application code. The application code can treat any other service in the architecture as a single logical component running on a local port on the same host.
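The final point, that application code can address any peer as if it ran on a local port, can be sketched as follows. The port number and service names here are hypothetical; real meshes (for example, Envoy-based sidecars) configure this routing for you rather than having it hard-coded.

```python
SIDECAR_PORT = 15001  # hypothetical: the port the local sidecar proxy listens on


def mesh_url(service, path):
    """Build the address the application actually dials: always localhost.

    The sidecar inspects the Host header (or an equivalent routing rule)
    and forwards the call to a healthy instance of `service` elsewhere in
    the mesh, handling retries, timeouts, and mutual TLS along the way.
    """
    url = f"http://localhost:{SIDECAR_PORT}/{path.lstrip('/')}"
    headers = {"Host": service}
    return url, headers


# The app treats the remote `orders` service as a local logical component:
url, headers = mesh_url("orders", "/v1/orders/42")
```

The business logic never learns real network locations; moving, scaling, or securing the `orders` service is then purely a mesh-configuration change, which is exactly the separation the excerpt describes.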



Quote for the day:


"You never will be the person you can be if pressure, tension and discipline are taken out of your life." -- Dr James G Bilkey


March 08, 2016

Use a BPM strategy to modernize legacy applications

As is nearly always the case, enterprise architecture may provide an easy path if an "EA model" is available. It would be fair to say that for a major enterprise to modernize legacy applications on a large scale, it should never proceed without first developing an EA model according to one of the established standards such as TOGAF. Where the scope of application modernization projects is more limited, it's possible to recover business process definitions from current applications. Where you have no EA framework for direct BPM mapping, take application workflows and "abstract" them by grouping application features into the business processes they support.


The Other Side of Agile: Ceremonial Development

As you can see, ironically, the Agile Manifesto is very simple. Good Agile practices are much more in the spirit of Kaizen and continuous improvement, as opposed to the sterile doctor's prescription of do’s and don’ts that most people associate with Agile. And now that I come to realize it, the most successful teams that I’ve worked with have excelled exactly at this — responding and adapting to change. These teams were great at what they did because they had mechanisms in place to continuously improve their own delivery. Truth be told, they weren’t always great because of their code reviews. Or pair programming. Or stand-up meetings. Or user stories. These things were sometimes very important in the delivery, but once something becomes routine, it can be hard to take a step back and evaluate whether it is still delivering on its value proposition.


Seagate Reveals World's Fastest SSD

Seagate's new SSD is based on the non-volatile memory express (NVMe) interface, which was developed by a cooperative of more than 80 companies and released in March 2011. The NVMe specification defined an optimized register interface, command set and feature set for SSDs using the PCIe interface -- a high-speed serial computer expansion bus standard used in both enterprise and client systems. Intel's SSD 750 series drive also uses the NVMe/PCIe interface. The Seagate SSD sports read speeds of up to 2,500MB per second, or 2.5GB per second. "The unit could be used in an all-flash array or as an accelerated flash tier with hard-disk drives (HDDs) for a more cost-effective hybrid storage alternative," Seagate stated in a news release about the new SSD.


Interview: Laura Galante, FireEye

“How are we not able to solve this problem? Because we don’t have visibility into it? The suspicion is the data is probably sitting there in the private sector because everyone is feeling this too. The perfect marriage was Mandiant sitting there with all of this investigation data and thinking, what if there is something huge here and IP is going out the door? We didn’t know how to think about it, and Mandiant needed intelligence so they hired a few of us out of government to figure out what the data was, how to model and analyse it and that is just what we did.” Galante worked on the APT1 report that was released in February 2013, and this allowed her to see network data on the host side and not just on the network, and understand what malware is sitting there that sends out these alerts.


Breaking the Glass Ceiling in Indian IT Firms

It is not uncommon for women to face unconscious bias at work, which may impact them negatively and make them feel out of place in a largely male-dominated industry like technology. For instance, unconscious bias can arise when male team members put in long working hours for a project while female workers leave the office at fixed times. This can be misconstrued as the male workers contributing more to the project, whereas in reality both male and female employees could be contributing the same, or the latter even more for that matter. Organizations are now actively working to mitigate gender bias and bring in more transparency that would make women feel more included.


Bimodal IT strategy opens up opportunities for innovation

Today's application lifecycle is measured in weeks, not years, meaning neither customers nor employees have the patience for a lengthy software development process. Organizations that are too slow to capitalize on an emerging digital business opportunity lose out to competitors that move quickly. But such a quick process requires using Agile development practices, fostering close cooperation between developers and IT operations, heavily instrumenting applications to measure performance, feature usage and errors, and employing continuous delivery processes that facilitate a steady stream of bug fixes and feature enhancements.


Scrum is Just a Starting Point | The Clever PM

There is certainly value to be had in looking to prescriptive definitions like those found in the Scrum Guide — they provide us all with a common understanding of the component parts of what that particular publisher or consultancy has defined as “Scrum”. It enables us to have intelligent conversations using such jargon words as Product Manager, Scrum Master, Stand Up, Retrospective, and other terms that have only contextual meaning within the world of Scrum. It also provides those who need guidance and assistance in establishing the foundation for Agile practices with some clearly-defined, specifically-actionable, and proven steps to take and ceremonies to implement to achieve their goals.


DHL Asia-Pacific Innovation Centre incubates future logistics technology

“The innovation agenda is not a new one for DHL,” said Mei Pang, vice-president, innovation, solution delivery and service management at DHL customer solutions and innovation in Asia-Pacific. “From an operational point of view, DHL has always known to come out with new things. In 2007, our corporate office in Germany made a decision to invest in a central team to focus on innovation to look at the future of logistics and identify major trends,” Pang told Computer Weekly. “Part of the initiative was to open a conversation with partners, and the approach we take is a very collaborative one where we work with suppliers, customers and academics to focus on the use cases and try to make them practically applicable in our business,” she added. “That concept worked very well in Germany.”


Intel's Pentium Bug Fix Is Proposed as Solution for Dark Pools

The pitch comes as banks have been beset by fines. UBS was fined $14.4 million by the SEC for problems at its private stock-trading platform. Barclays Plc and Credit Suisse Group AG racked up more than $154 million to settle allegations that they misled investors about how their dark pools were managed. Investment Technology Group Inc. agreed to pay $20.3 million for its infractions. Aesthetic Integration was founded by Denis Ignatovich, formerly head of the central risk trading desk at Deutsche Bank AG in London, and Grant Passmore, a mathematician and expert on formal verification.  Passmore said formal verification uses algorithms to analyze other algorithms. Rather than endlessly trying to test possible outcomes, machine reasoning acts like an automated mathematician, creating proofs and theorems to speed up the work.


Testers in TDD teams

The big QA of the nineties seems to be history. Many IT organizations have dissolved their QA departments and spread their testers across Agile teams. However, in many of those teams, the testers are still doing the same manual testing they did in the nineties. Many organizations are therefore still stuck with the same dysfunctional testing they had twenty years ago. The dysfunctionality of old-school QA lies in its excessive use of functional testers. These are professionals specialized in manual testing but with few technical skills. Their specialization makes functional testers good at 'testing' functionality. However, old-school QA has a tendency (and often a commercial interest) to also use these testers to 'check' functionality.



Quote for the day:


"Goals allow you to control the direction of change in your favor." -- Brian Tracy


September 06, 2014

Your Database: The Threat That Lies Within
Unlike other software components and code or compiled code, a database is not a collection of files. It cannot just be copied and pasted from development to testing and to production, because it is a container of your most valued asset – your business data, which must be preserved. In most cases, database development is also performed in a very different way than application code (.Net or Java development), as developers and DBAs are accessing and changing a shared resource, a central database, rather than a local copy on their workstation.


Data Mining Reveals How Social Coding Succeeds (And Fails)
A social coding project begins when a group of developers outline a project and begin work on it. These are the “internal developers” and have the power to update the software in a process known as a “commit”. The number of commits is a measure of the activity on the project. External developers can follow the progress of the project by “starring” it, a form of bookmarking on GitHub. The number of stars is a measure of the project’s popularity. These external developers can also request changes, such as additional features and so on, in a process known as a pull request.


Data Breach and Spear Phishing
In the online world, spear phishing is where a spammer leverages legitimate information to trick the recipient. The bait can appear to come from a recognized person or company, or you could get an email addressed to you asking for additional information. If the sender can target the email to your needs, include personalization and grab your attention, they can trick you into doing a lot. Savvy spear phishers add a multi-channel twist, incorporating calls, verifying your address (or where you bank, where you shop or your kids' schools), sending the promised follow-up email, incorporating letters – anything to get your attention.


The Innovation Dead End
You can certainly hire people who’ve never failed; their courage can have a buoying effect on everyone else — but they too will become risk-averse over time as they encounter failure, so it’s not a lasting solution. You can (and should) make every effort to fail as fast as possible to minimize the human costs of failure. But that tactic is limited by how long it realistically takes to prove or disprove the kind of ideas you work on. Even after ruthlessly optimizing project definition and proof of concept, failing fast can still take months or years, especially if your innovation is technical, rather than product- or market-based.


Australian streaming services lock down content before Netflix
Speaking yesterday at the ASTRA 2014 conference in Sydney, Presto's director Shaun James said that Presto was on the offensive, rather than defensive in getting into the market now before Netflix arrives. "It's not defensive, we're playing offensive with Presto. Yes, there are some reasons for getting into that business and having first-mover advantage, and yes, we are using the benefits of being part of the Foxtel family, but it is very much an offensive. We're up and running, and we're going to be aggressive," he said.


Future of IT standards, SOA, and disruptive technologies stands strong
SOA has been established for a long time. It was declared dead at one point. In fact, the person who made that statement eventually had to recant and admit that it was not dead. From an Open Group perspective, we don't normally think that something that comes up with a bang like SOA may still be around 10 years later, but that seems to be the case. ... There is perhaps a change in emphasis on the techniques used under the heading of 'SOA,' but certainly there was a point at which it became unpopular to go to your CIO and say,


The Life and Times of TDD
A TDD approach can be used to specify the detailed design of your application code, database schema, or user interface (UI) in a JIT executable manner throughout construction. This is referred to as developer TDD or unit TDD and is typically done via xUnit tools such as JUnit for Java and PL/Unit for Oracle. Not surprisingly, the survey found that TDD practitioners are commonly doing more than just TDD to explore their designs. People doing developer TDD were also working on teams who were applying other design-related activities.
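The developer-TDD cycle works the same way in any xUnit-family tool. Here is a minimal red-green sketch using Python's built-in unittest; the `slugify` function and its test are hypothetical illustrations, not taken from the survey.

```python
import unittest


def slugify(title):
    # Written AFTER the test below was first seen to fail ("red"),
    # with just enough code to make it pass ("green"); refactoring
    # comes next, with the test as a safety net.
    return title.strip().lower().replace(" ", "-")


class TestSlugify(unittest.TestCase):
    # The test was written first: it IS the detailed design, stated
    # as an executable specification.
    def test_lowercases_and_hyphenates(self):
        self.assertEqual(slugify("  Hello World "), "hello-world")

# Run with: python -m unittest this_file.py
```

The rhythm is the point: each new behavior begins as a failing test, so the design emerges one verified increment at a time.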


Motivating the Negative Nancy on Your Team
A “Negative Nancy” is someone who overgeneralizes in labeling situations and people, focuses on the bad in each situation, jumps to conclusions and constantly redirects the blame. In a business setting, these behaviors can result in harmful effects, such as reduced productivity, decreased group morale, increased stress, wasted time, hindered creativity and innovation, and higher employee turnover. ... “Allowing [negativity] to fester is much more costly and damaging to an organization’s bottom line than confronting or possibly replacing a single toxic employee,” said president and CEO of Fierce Inc.


Berlin: A British Perspective on Germany’s Tech Hub
Contrary to the stereotype, Germans tend to be friendly, welcoming and warm people. A quarter of a century ago I drove around mainland Europe with two friends in a converted Bedford van. Scruffily dressed and culturally naive, we saw everyday life and prejudices in a dozen countries. The Dutch were fun, the French loathed us, the Germans went out of their way to be helpful. Statistically meaningless anecdotes, I know, but those German attitudes are certainly visible in Berlin. You don't really know what helpfulness is like until you've walked around a city at night trying to find a GP to prescribe antibiotics and painkillers for your daughter's ear infection.


Aligning People, Processes and Technology for Successful Data Governance
The legal and compliance world is continuously evolving, and every industry must understand how laws and regulations apply to them. Often regulations force companies to maintain data for a set period of time and, most importantly, search and produce this data when needed. To reduce litigation risk, legal generally reduces the amount of time that a company keeps data to the bare minimum. At the same time, there is also an increased burden for legal discovery (e-discovery). Companies are now required to be able to produce data related to a case in a reasonable amount of time.



Quote for the day:

"Work like you don't need the money. Love like you've never been hurt. Dance like nobody's watching." -- Satchel Paige