
Daily Tech Digest - December 23, 2023

How LLMs made their way into the modern data stack in 2023

Beyond helping teams generate insights and answers from their data through text inputs, LLMs are also handling traditionally manual data management and the data efforts crucial to building a robust AI product. In May, Intelligent Data Management Cloud (IDMC) provider Informatica debuted Claire GPT, a multi-LLM-based conversational AI tool that allows users to discover, interact with and manage their IDMC data assets with natural language inputs. It handles multiple jobs within the IDMC platform, including data discovery, data pipeline creation and editing, metadata exploration, data quality and relationships exploration, and data quality rule generation. Then, to help teams build AI offerings, California-based Refuel AI provides a purpose-built large language model that helps with data labeling and enrichment tasks. A paper published in October 2023 also shows that LLMs can do a good job at removing noise from datasets, which is also a crucial step in building robust AI. Other areas in data engineering where LLMs can come into play are data integration and orchestration. 
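The data-labeling use case is easy to picture in code. The sketch below is hypothetical – `call_llm` stands in for a real model endpoint (Refuel's hosted model or any chat-completion API) and is stubbed with a keyword heuristic here so the example runs offline:

```python
# Hypothetical sketch of LLM-assisted data labeling: build a prompt per
# record and parse the model's answer into a label column.

def call_llm(prompt: str) -> str:
    # Placeholder for a real LLM call; a trivial keyword heuristic
    # stands in for the model so this sketch is self-contained.
    return "positive" if "love" in prompt.lower() else "negative"

def label_records(records, categories):
    labeled = []
    for text in records:
        prompt = (
            f"Classify the following review as one of {categories}.\n"
            f"Review: {text}\nAnswer with the category only."
        )
        answer = call_llm(prompt).strip().lower()
        # Keep only answers that are valid categories; otherwise flag for review.
        labeled.append((text, answer if answer in categories else None))
    return labeled

rows = ["I love this blender", "Broke after two days"]
print(label_records(rows, ["positive", "negative"]))
# -> [('I love this blender', 'positive'), ('Broke after two days', 'negative')]
```

In a real pipeline, the validation step (rejecting answers outside the label set) is what makes LLM labels usable downstream.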


Corporate governance in 2023: a year in review

2023 has seen a continuing trend of more responsibilities for directors. Often, this responsibility comes from regulators; sometimes, it comes from investors or other stakeholders. One thing is certain, though: directors are rapidly losing any remaining wiggle room to be “rubber-stamp” individuals. Modern board roles carry serious accountability; many directors are starting to appreciate that and adhere to new standards. The trouble is that sometimes the new standards overstretch directors – so much so that we now have concerns about overboarding, exhaustion, and undue stress. How will that play out if the trend of more responsibility continues? ... The board dismissed the evidently popular CEO Sam Altman in a decision made behind closed doors with utmost secrecy. And as the world’s attention predictably turned their way, they could give no answers. Soon, Altman was rehired after around 70% of the company’s staff threatened to resign and join Microsoft (a significant OpenAI investor). The board subsequently agreed to undergo a major reshuffle for more accountability and transparent decision-making.


Quantum Computing’s Hard, Cold Reality Check

The problem isn’t just one of timescales. In May, Matthias Troyer, a technical fellow at Microsoft who leads the company’s quantum computing efforts, co-authored a paper in Communications of the ACM suggesting that the number of applications where quantum computers could provide a meaningful advantage was more limited than some might have you believe. “We found out over the last 10 years that many things that people have proposed don’t work,” he says. “And then we found some very simple reasons for that.” The main promise of quantum computing is the ability to solve problems far faster than classical computers, but exactly how much faster varies. There are two applications where quantum algorithms appear to provide an exponential speedup, says Troyer. One is factoring large numbers, which could make it possible to break the public-key encryption the internet is built on. The other is simulating quantum systems, which could have applications in chemistry and materials science. Quantum algorithms have been proposed for a range of other problems including optimization, drug design, and fluid dynamics.


Navigating the Data Landscape: The Crucial Role of Data Governance in Today’s Business Environment

Data quality management has become increasingly paramount as the volume of data rises exponentially day by day. Organizations can protect their data with policies and procedures, ensure that they follow all the rules and regulations, and hire people who understand the data being collected and what it means to the company – but if that data isn’t high quality, the organization may get the short end of the stick. Maybe you’re three weeks late to a TikTok trend, or you miss out on a whole subset of customers because of a misstep in your collection methods; either way, that lost profit and the lost chance to build on that data point in the future could prove a pivotal loss. Ensuring that your organization has processes to monitor and improve your data quality on a continuous basis will save your organization time and money in the long run. Despite its importance, implementing effective data governance comes with challenges. Organizations often face resistance to change, cultural barriers, and the complexity of managing diverse data sources.


Choosing Between Message Queues and Event Streams

There are numerous distinctions between technologies that allow you to implement event streaming and those that you can use for message queueing. To highlight them, I will compare Apache Kafka and RabbitMQ. I’ve chosen Kafka and RabbitMQ specifically because they are popular, widely used solutions providing rich capabilities that have been extensively battle-tested in production environments. ... Message queueing and event streaming can both be used in scenarios requiring decoupled, asynchronous communication between different parts of a system. For instance, in microservices architectures, both can power low-latency messaging between various components. However, going beyond messaging, event streaming and message queueing have distinct strengths and are best suited to different use cases. ... Message queueing is a good choice for many messaging use cases. It’s also an appealing proposition if you’re early in your event-driven journey; that’s because message queueing technologies are generally easier to deploy and manage than event streaming solutions. 
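The semantic difference is easy to show with toy in-memory stand-ins (these are illustrative classes, not the Kafka or RabbitMQ APIs): a queue performs destructive, one-shot delivery, while a stream is an append-only log that any consumer can replay from an offset:

```python
from collections import deque

class MessageQueue:
    """A message is delivered to one consumer, and then it is gone."""
    def __init__(self):
        self._q = deque()
    def publish(self, msg):
        self._q.append(msg)
    def consume(self):
        return self._q.popleft() if self._q else None  # destructive read

class EventStream:
    """An append-only log; consumers replay from any offset they like."""
    def __init__(self):
        self._log = []
    def publish(self, event):
        self._log.append(event)
    def read(self, offset=0):
        return self._log[offset:]  # non-destructive, repeatable

queue = MessageQueue()
queue.publish("order-1")
print(queue.consume())  # order-1
print(queue.consume())  # None -- consumed messages are gone

stream = EventStream()
stream.publish("order-1")
stream.publish("order-2")
print(stream.read(0))   # ['order-1', 'order-2']
print(stream.read(0))   # ['order-1', 'order-2'] -- replayable
```

The replayability of the log is what makes event streaming suited to analytics and event sourcing, while one-shot delivery fits task distribution between workers.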


5G and edge computing: What they are and why you should care

Instead of relying solely on large, high-powered cell towers (as 4G does), 5G will run off both those towers and a ton of small cell sites that can be clustered together. This is how 5G achieves its population density. 5G is also supposed to be more energy efficient. As such, the communications component of IoT devices won't drain as much power, resulting in longer battery life for connected devices. There's also a ton of AI and machine learning in 5G implementations. 5G nodes and interface devices are deployed on the edge, away from central hubs. They utilize AI and machine learning to analyze communications performance, and use AI to bandwidth-shape communications, to wring as much performance out of the hardware as possible. You're familiar with the term "cloud computing." We've all used cloud services, services that run on a server someplace rather than on our desktop computers or mobile devices. The cloud, of course, isn't really a cloud. Amazon, Google, Facebook, Microsoft, and others operate massive data centers packed with thousands upon thousands of servers. Soft and fluffy, the cloud is not.


Stolen Booking.com Credentials Fuel Social Engineering Scams

Social engineering expert Sharon Conheady said this type of trickery remains extremely difficult to repel, because of the customer-first nature of hospitality. Many public-facing people in such organizations, such as receptionists, are "trained to help people - that's their job," and of course they're going to bend over backwards to try to meet apparent customers' demands, Conheady said in an interview at this month's Black Hat Europe conference in London. Help desks remain another frequent target. "I had a client lately who asked me to call the help desk and obtain BitLocker keys," she said, referring to a recent penetration test. "Every single one of the help desk agents gave us the BitLocker key." That prompted her to ask: Do these personnel even know what a BitLocker key is, and why they shouldn't share it? The client said they didn't know. While training people in customer-facing roles can help, Conheady said the only truly effective approach would be to put in place strong technical controls to outright prevent and block such attacks.


Significantly Improving Security Posture: A CMMI Case Study

“Phoenix Defense has led the way in adopting CMMI best practices for nearly two decades, and has now included the Security best practices,” says Kris Puthucode, Certified CMMI High Maturity Lead Appraiser at Software Quality Center LLC. “This adoption has yielded quantifiable benefits, enhancing security posture across Mission, Personnel, Physical, Process, and Cybersecurity domains. Additionally, incorporating Virtual work best practices has standardized virtual meetings and events, boosting efficiency.” Phoenix Defense has been a CMMI Performance Solutions Organization since 2005, first achieving Maturity Level 5 in 2020. ... Before adopting the CMMI Security and Managing Security Threats and Vulnerabilities Practice Areas in the model, Phoenix Defense had a closed network with no outward-facing applications and relied on a third-party vendor to monitor threats and spam. They did not fully, quantitatively track attacks against the networks or other data flows, and they required a more robust approach to properly ensure network security.


5 common data security pitfalls — and how to avoid them

While regulations like GDPR and SOX set standards for data security, they are merely starting points and should be considered table stakes for protecting data. Compliance should not be mistaken for complete data security, as robust security involves going beyond compliance checks. The fact is that many large data breaches have occurred in organizations that were fully compliant on paper. Moving beyond compliance requires actively identifying and mitigating risks rather than just ticking boxes during audits. ... Data is one of the most valuable assets for any organization. And yet, the question, “Who owns the data?” often leads to ambiguity within organizations. Clear delineation of data ownership and responsibility is crucial for effective data governance. Each team or employee must understand their role in protecting data to create a culture of security. ... Unpatched vulnerabilities are one of the easiest targets for cyber criminals. This means that organizations face significant risks when they can’t address public vulnerabilities quickly. Despite the availability of patches, many enterprises delay deployment for various reasons, which leaves sensitive data vulnerable.


Outmaneuvering AI: Cultivating Skills That Make Algorithms Scratch Their Head

Reasoning, the intellectual ninja of skills, is all about slicing through misinformation, assumptions, and biases to get to the heart of the matter. It’s not just drawing conclusions, but thinking about how we do that. This skill is the brain’s bouncer, keeping cognitive fallacies and hasty generalizations at bay. We humans, bless our hearts, are prone to jumping on the bandwagon or seeing patterns where there are none (like seeing a face on Mars or believing in hot streaks at Vegas). These mental shortcuts, or heuristics, can lead us astray, making reasoning not just useful but essential. AI is trained on our past reasoning reflected in old works. But it can’t reason on its own — at least not yet. Consider a business deciding whether to invest in a new technology. Without proper reasoning, they might follow the hype (everyone else is doing it!) or rely on gut feelings (it just feels right!). But with reasoning, they dissect the decision, weigh the evidence, consider alternatives, and make a choice that’s not just good on paper, but good in reality.



Quote for the day:

"Whether you think you can or you think you can’t, you’re right." -- Henry Ford

Daily Tech Digest - June 30, 2021

DigitalOcean aligns with MongoDB for managed database service

There is, of course, no shortage of DBaaS options these days. DigitalOcean is betting that its Managed MongoDB service will not only extend the appeal of its cloud service to developers, but also to SMBs that are looking for less costly alternatives to the three major cloud service providers, Cooks said. MongoDB already has a strong focus on developers who prefer to download an open source database to build their applications. In addition to not having to pay upfront licensing fees, in many cases developers don’t need permission from a centralized IT function to download a database. However, once that application is deployed in a production environment, some person or entity will have to manage the database. That creates the need for the DBaaS platform from MongoDB that DigitalOcean is now reselling as an OEM partner, said Alan Chhabra, senior vice president for worldwide partners at MongoDB. The DigitalOcean Managed MongoDB service is an extension of an existing relationship between the two companies that takes managed database services to the next logical level, Chhabra asserted. “We have a long-standing relationship,” he said.


Digital transformation at SKF through data driven manufacturing approach using Azure Arc enabled SQL

As SKF looked for a solution that supported their data-driven manufacturing vision for the Factories of the Future, they wanted one able to support distributed innovation and development, high availability, scalability, and ease of deployment. They wanted each of their factories to be able to collect, process, and analyze data to make real-time decisions autonomously while being managed centrally. At the same time, they had constraints of data latency, data resiliency, and data sovereignty for critical production systems that could not be compromised. The drivers behind adopting a hybrid cloud model came from factories having to meet customer performance requirements, many of which depend on the ability to analyze and synthesize the data. Recently, data analytics paradigms have shifted from big data analysis in the cloud to more data-driven manufacturing at the machine, production line, and factory edge. Adopting cloud-native operating models, but in a capacity where they can execute workloads physically on-premises at their factories, turned out to be the right choice for SKF.


A new dawn for enterprise automation – from long-term strategy to an operational imperative

To drive sustainable change, organisations need to take a large-scale, end-to-end strategic approach to implementing enterprise automation solutions. On one level, this is a vital step to avoid any future architecture problems. Businesses need to spend time assessing their technology needs and scoping out how technology can deliver value to their organisation. Take, for example, low-code options like drag-and-drop tools. This in-vogue technology is viewed by companies as an attractive, low-cost option to create intuitive interfaces for internal apps that gather employee data – as part of a broad automation architecture. The issue is that lots of firms rush the process, failing to account for functionality problems that regularly occur when integrating into existing, often disparate systems. It is here where strategic planning comes into its own, ensuring firms take the time to get the UX to the high standard required, as well as identify how to deploy analytics or automation orchestration solutions to bridge these gaps, and successfully deliver automation. With this strategic mindset, there is a huge opportunity for businesses to use this thriving market for automation to empower more innovation from within the enterprise.


The Rise Of NFT Into An Emerging Digital Asset Class

The nature of NFTs being unique, irreplaceable, immutable, and non-fungible makes them an attractive asset for investors and creators alike. NFTs have empowered creators to monetize and value their digital content, be it music, videos, memes, or art on decentralized marketplaces, without having to go through the hassles that a modern-day creator typically goes through. NFTs, at their core, are digital assets representing real-world objects. ... NFTs solve the age-old problems that creators like you and me have always faced when protecting our intellectual property from being reproduced or distributed across the internet. The most popular standards for NFTs today are ERC-721 and ERC-1155. ERC-721 was used in the majority of early NFTs until ERC-1155 was introduced. With that said, these token standards have laid the foundation for assets that are programmable and modifiable, thereby setting the cornerstone for digital ownership and leading to all sorts of revolutionary possibilities. The NFT ecosystem has found its way into various industries as more people join hands and dive deeper into its novel possibilities.


Three Principles for Selecting Machine Learning Platforms

Of the challenges this company faced from its previous data management system, the most complex and risky was in data security and governance. The teams managing data access were Database Admins, familiar with table-based access. But the data scientists needed to export datasets from these governed tables to get data into modern ML tools. The security concerns and ambiguity from this disconnect resulted in months of delays whenever data scientists needed access to new data sources. These pain points led them towards selecting a more unified platform that allowed DS & ML tools to access data under the same governance model used by data engineers and database admins. Data scientists were able to load large datasets into Pandas and PySpark dataframes easily, and database admins could restrict data access based on user identity and prevent data exfiltration. ... A data platform must simplify collaboration between data engineering and DS & ML teams, beyond the mechanics of data access discussed in the previous section. Common barriers are caused by these two groups using disconnected platforms for compute and deployment, data processing and governance.


Introduction To AutoInt: Automatic Integration For Fast Neural Volume Rendering

AutoInt, short for automatic integration, is a modern framework for fast neural volume rendering. It learns closed-form solutions to the volume rendering equation, an integral equation that accumulates transmittance and emittance along rays to render an image. While conventional neural renderers require hundreds of samples along each ray to evaluate such integrals – hundreds of costly forward passes through a network – AutoInt evaluates them with far fewer forward passes. For training, it first instantiates the computational graph corresponding to the derivative of the coordinate-based network. This graph is then fitted to the signal to be integrated. After optimization, AutoInt reassembles the graph to obtain a network that represents the antiderivative. The fundamental theorem of calculus then enables the calculation of any definite integral in just two evaluations of the network. Applied to neural image rendering, this approach improves the tradeoff between rendering speed and image quality, cutting render times by greater than 10× at the cost of slightly reduced image quality.
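The two-evaluation trick can be sketched numerically. In the sketch below, a small fixed tanh network stands in for the trained antiderivative network; the real AutoInt fits the derivative ("grad") network to the rendering integrand during training, whereas the weights here are simply random:

```python
import numpy as np

# If a network F represents the antiderivative, its derivative f = F'
# is what gets fitted to the signal, and any definite integral of f
# then costs two evaluations of F (fundamental theorem of calculus).

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 1)), rng.normal(size=(8, 1))
W2, b2 = rng.normal(size=(1, 8)), rng.normal(size=(1, 1))

def F(x):
    # Antiderivative network: a 1 -> 8 -> 1 MLP with tanh activation.
    return (W2 @ np.tanh(W1 * x + b1) + b2).item()

def f(x, h=1e-5):
    # Derivative of F (via central differences for this sketch); in
    # AutoInt this grad network is built symbolically from F's graph.
    return (F(x + h) - F(x - h)) / (2 * h)

a, b = -1.0, 2.0
# Conventional quadrature needs many samples of f along the "ray"...
xs = np.linspace(a, b, 10_001)
fs = np.array([f(x) for x in xs])
quad = np.sum((fs[:-1] + fs[1:]) / 2) * (b - a) / (len(xs) - 1)
# ...while the antiderivative needs exactly two network evaluations.
two_evals = F(b) - F(a)
print(abs(quad - two_evals) < 1e-4)  # True
```

The 10,001-sample trapezoid sum and the two-evaluation answer agree, which is the speedup AutoInt exploits along every camera ray.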


How Google is Using Artificial Intelligence?

In the old times, we depended on paper maps or on the suggestions of people well-versed in the routes to our destinations. The problem was that we never reached our spots on time. Now, you need not seek such suggestions from people or a paper map, as Google Maps has solved these difficulties. Covering over 220 countries and territories – from the United States to Pakistan and Australia – it lets you affordably reach the places you have already decided on. You may curiously ask about the technology embedded, and the answer is artificial intelligence. The main concept is global localization, which relies on AI. This helps Google Maps understand your current or future orientation. The application then precisely spots your longitude and latitude, and as you or your vehicle proceed further, Google Maps localizes against hundreds of trillions of Street View data points. As you keep traversing, the application announces a series of suggestions, helping you reach a shopping mall, airport, or other transit station. Apart from this, you can prepare a list of places you will visit, set routing options as per your preferences, explore the Street View option in Live mode, and so on.


What is edge computing and why does it matter?

There are as many different edge use cases as there are users – everyone’s arrangement will be different – but several industries have been particularly at the forefront of edge computing. Manufacturers and heavy industry use edge hardware as an enabler for delay-intolerant applications, keeping the processing power for things like automated coordination of heavy machinery on a factory floor close to where it’s needed. The edge also provides a way for those companies to integrate IoT applications like predictive maintenance close to the machines. Similarly, agricultural users can use edge computing as a collection layer for data from a wide range of connected devices, including soil and temperature sensors, combines and tractors, and more. The hardware required for different types of deployment will differ substantially. ... Connected agriculture users, by contrast, will still require a rugged edge device to cope with outdoor deployment, but the connectivity piece could look quite different – low-latency might still be a requirement for coordinating the movement of heavy equipment, but environmental sensors are likely to have both higher range and lower data requirements – an LP-WAN connection, Sigfox or the like could be the best choice there.


Artificial Intelligence (AI): 4 novel ways to build talent in-house

To discover the gems hidden across your organization, you must start maintaining a self-identified list of skills for every employee. The list must be updated every six months and be openly searchable by associates to make it useful and usable. Palmer recommends self-classifying each individual’s skills into four categories: expert, functioning, novice, and desired stretch assignment. This allows teams with hiring needs to scout for individuals with ready skills and those with growth aspirations in the five competencies needed for AI. Finding the right content to upskill your in-house teams is a challenge. Despite the rapid mushrooming of training portals and MOOCs (massive open online courses), the curriculums may not meet your organization’s specific needs. However, with access to such great content online, often for free, it may not make sense to recreate your content. “You must design your own curriculum by curating content from multiple online sources,” says Wendy Zhang, director of data governance and data strategy at Sallie Mae. Base the training plan on your team’s background, roles, and what they need to succeed. 


Solving Mysteries Faster With Observability

Let's start by looking at the sources that we turn to when we look for clues. We often begin with observability tooling. Logs, metrics, and traces are the three pillars of observability. Logs give a richly detailed view of an individual service and provide the service a chance to speak its own piece about what went right or what went wrong as it tried to execute its given task. Next, we have metrics. Metrics indicate how the system or subsets of the system, like services, are performing at a macro scale. Do you see a high error rate somewhere, perhaps in a particular service or region? Metrics give you a bird's eye view. Then we have traces, which follow individual requests through a system, illustrating the holistic ecosystem that our request passes through. In addition to observability tooling, we also turn to metadata. By metadata, I mean supplemental data that helps us build context. For us at Netflix, this might be, what movie or what show was a user trying to watch? What type of device were they using? Or details about the build number, their account preferences, or even what country they're watching from. Metadata helps add more color to the picture that we're trying to draw.
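As a rough sketch of how the pillars and metadata fit together (the field names below are illustrative, not Netflix's actual schema), a single traced call can emit a structured log line, increment a metric, and record a trace span that all share one trace id:

```python
import logging
import time
import uuid

logging.basicConfig(level=logging.INFO, format="%(message)s")
metrics = {}   # metric name -> counter (the bird's-eye view)
spans = []     # finished spans, one per traced call (the request's path)

def traced(service, metadata=None):
    def wrapper(fn):
        def inner(*args, trace_id=None, **kwargs):
            # Propagate the caller's trace id, or start a new trace.
            trace_id = trace_id or uuid.uuid4().hex
            start = time.perf_counter()
            try:
                return fn(*args, trace_id=trace_id, **kwargs)
            finally:
                elapsed = time.perf_counter() - start
                key = f"{service}.calls"
                metrics[key] = metrics.get(key, 0) + 1          # metric
                spans.append({"trace": trace_id,                # trace
                              "service": service,
                              "secs": elapsed,
                              "meta": metadata or {}})          # metadata
                logging.info("service=%s trace=%s secs=%.6f",   # log
                             service, trace_id, elapsed)
        return inner
    return wrapper

@traced("playback", metadata={"device": "tv", "country": "US"})
def start_stream(title, trace_id=None):
    return f"streaming {title}"

print(start_stream("Stranger Things"))  # streaming Stranger Things
```

During an investigation, the shared trace id is what lets you pivot between the three views: find a spike in the metric, pull the matching spans, then read each service's log lines for that trace.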



Quote for the day:

"A sense of humor is part of the art of leadership, of getting along with people, of getting things done." -- Dwight D. Eisenhower

Daily Tech Digest - May 02, 2018

Next Port of Call — Digitization of Automotive Retail

Image Attribute: Inside a car showroom / Source: Mercedes-Benz of Encino/Flickr
As per Cox Automotive's survey, for every retail sale, customers visit the auto dealer only two to three times at most, including to sign the contract and to take custody of the vehicle. However, consumers are also taking a less-beaten path – for example, initiating the buying process online by “building a vehicle” to their specifications and then searching inventory in a specific geography. The buyer evaluates their current vehicle’s trade-in value based on its model, option content, age, and condition. The financial institution (either a traditional bank or a newer online lender) reviews, selects, and approves financing and the consumer’s choice of purchase or lease in real time. The purchase process then shifts from digital to more traditional retail when the consumer arrives at the dealership to test drive the vehicle and sign the necessary paperwork to take ownership. Some dealers, taking advantage of their close proximity to the customer, further emulate the new online purchasing model by delivering the vehicle directly to the customer’s home at no charge.


Resolving who actually owns security in agile development

As we know, the developers’ main focus is getting a working product out the door as fast as possible, while the security folks want to reduce the chances that the product will contain vulnerabilities. Ideally, the developers would be able to code without any interruption or interference from the security folks. However, since developers are only human, there will always be flaws in the code that they write themselves, as well as issues in the code that they take from third parties, like open source repositories from sources like GitHub. We know that it is cheaper in terms of time and money to catch and fix vulnerabilities early in the process rather than later, especially when your developers have built more features on top of imperfect code. Moreover, we see a bottleneck occurring when security issues are left unaddressed until a short while before release (when stress levels are particularly high).


Shifting a Corporate Culture at Scale — and with Speed


Speed was very important in decision making. The culture of the prior organization was to extensively “discuss and deliberate.” As an example, the first meeting I was at had 25 people. My first call had 100 people. People were coming into meetings who were not necessarily contributing but who were transcribing and communicating to others; they weren’t the people who were supposed to take the action. One of the first meetings I had on July 14 was a review of the business. I had a stack of paper on one side, a stack of paper on the other side. I said, “I’m going to make a policy decision: no more paper.” And of course, I got a call that evening, saying, “Hey, I don’t know if you are aware of the fact you work for Xerox.” And I said, “Oops.” I said no more paper because the idea is to quickly convert people from the previous approach. People are showing up, and they’re basically reading off the presentation. So we changed that. But organizational structure is the clearest way to inform you as to how successful you will be.


State of Cybersecurity 2018: Enterprises Can Do Better

It seems that over the past 12 months, security has slipped down the boardroom agenda. According to the survey results, only 20% of organizations have their security function reporting to the CEO or main board. This represents an even lower figure than the 24% from last year (although the question in the previous year was phrased slightly differently). Also, 57% of the practitioners surveyed believed that their main board was adequately supporting security initiatives, a 10% decrease from the 67% figure from the previous year. On the bright side, 64% of enterprises were expecting to increase their cybersecurity budget this year, which also means that in 36% of enterprises, the expectation is to make do with the same or less money on their security efforts. That is an improvement over last year (where only 50% of respondents expected a security budget increase) but still shows a degree of complacency or risk-optimism in a sizable number of organizations.


Rip and replace your RDBMS? No – build cloud apps instead.

“Customer success is not just a nice way to give a new name to services. It is very much a mindset and a model that says you have to really understand what your customer is trying to achieve.” That advice also means architecture planning, tying DataStax into an array of tools and vendors, from the storage layer to the security layer to the middleware layer: “How we interact and engage with our partners is all very important.” So is this a revenue play for DataStax, or is it about solidifying the customer relationship and making sure the projects deliver? Bosworth says it’s very much the latter. Without opening the entire financial kimono, he offered this: "We don’t share a lot of financial information. One thing I can tell you is our gross margins run north of 75 percent – that’s our blended gross margin as a company. That’s really how you can figure out if a company is a services company or a software company. Certainly anything upwards of 70 percent puts you in the software category. … kind of time-to-impact if you will."


University of San Francisco GE Digital Transformation Case Study

Improving the productivity of existing assets by even a single percentage point can generate significant benefits in the oil and gas sector (and in other sectors). “The average recovery rate of an oil well is 35%, meaning 65% of a well’s potential draw is left in the earth because available technology makes it too expensive,” explains Haynes-Gaspar. “If we can help raise that 35% to 36%, the world’s output will increase by 80 billion barrels — the equivalent of three years of global supply. The economic implications are huge.” GE bet big on the Industrial Internet. The company put sensors on all of their products, including gas turbines, jet engines, and other machines; connected them to the cloud; and analyzed the resulting flow of data. The goal: identify ways to improve machine productivity and reliability. And it didn’t take long for GE engineers to realize that they could find interesting and unique patterns in the data.


Car hackers find remotely exploitable vulnerabilities in Volkswagen and Audi vehicles

The researchers noted, “Based on our experience, it seems that cars which have been produced before are not automatically updated when being serviced at a dealer, thus are still vulnerable to the described attack.” I encourage you to read their research paper, which delves into their attack strategy and technical system details, but it does not fully disclose the details of the remotely exploitable vulnerability because that, they believe, would be “irresponsible.” The researchers said they want to protect future cars but ask, “What about the cars of today or cars that were shipped last week? They often don’t have the required capabilities (such as over-the-air updates) but will be on our roads for the next fifteen years. We believe they currently pose the real threat to their owners, having drive-by-wire technology in cars that are internet-connected without any way to reliably update the entire fleet at once.” The hacked car models were from 2015, so if you have an Audi or Volkswagen, then contact your dealer and ask about a software update.


Collaboration with utilities seen as first step in growth of smart cities

Berst said cities can invest in becoming a smart city in small ways. From installing smart street lights to putting in solar rooftops and other distributed renewable energy sources, to providing residents with electric car charging stations, cities can not only provide a more environmentally friendly atmosphere, but also save money. Installing smart street lights, such as through the Urbanova initiative for example, can save a city millions in electricity costs. “Smart street lights have a pay-off of three years or less. It’s one of the lesser expensive on-ramps that can lead to a deeper collaboration,” Berst said. “While those trucks are there installing the LED street lights, why not have them snap in a communications network into that existing infrastructure while they are up there? Now, not only do you have smart street lights, but an entire communications network as well.”


A Quick Guide to Implementing ATDD


Collaboration is one of the core values of the Agile methodology. Once, as I was working on a large project, I noticed a lack of collaboration between developers, testers, and business-minded individuals; a lack of clarity in requirements; frequent requirements scope creep; a lack of visibility in regards to the testing completed; and defects being identified late in the project lifecycle. Most important to me, though, was that no one had any idea about our automation framework, so all of the automation tests were written after the features were developed and ready for testing. ... As a result, I found Acceptance Test Driven Development (ATDD) to be one of the approaches used to mitigate many of these issues. It is often used synonymously with Behavior Driven Development (BDD), Story Test Driven Development (SDD) and Specification By Example (SBE). The main distinction of ATDD, as opposed to other agile approaches, is its focus on making developers, testers, business people, product owners and other stakeholders collaborate as one unit and create a clear understanding of what needs to be implemented.


At Interop: Everyone Into the AI Pool

"Now is the time to proactively look for problems where you can apply this. Yes, I think it's that important," he said, adding that you could toss a dart at a company org chart and find an area that could benefit from AI. Helping to identify the problems to be solved, and the type of improvement -- be it a new product or service, or a process improvement -- that should result is where business leaders need to work with technologists and data scientists to match the goals with technology capabilities. Putting AI and machine learning into action is where David Karandish, founder and CEO of Ai Software, took over. There's been plenty of discussion about how to use intelligent assistants or agents in the corporate world, taking a step beyond the bots that have popped up on websites in recent years. Karandish introduced the audience to his company's "Jane", a chat-based assistant that answers questions for employees and customers when integrated with a client company's internal systems. It's in use at several client companies besides his own.



Quote for the day:


"Knowledge is like underwear. It is useful to have it, but not necessary to show it off." -- Bill Murray


Daily Tech Digest - July 18, 2017

Why automation isn’t everything in cybersecurity

Some new-generation solutions are purely focused on AI and machine learning. The promise is that you turn it on in your environment and, after a few days of the system learning on its own, it will be able to detect all the bad stuff. However, these systems suffer from a fatal flaw: they miss the business context, adaptability and explainability needed to be truly effective. What do human analysts know better than any system or, more importantly, any intruder? They know their own environment and the enterprise context, as well as having an intuition about how their system operates and what is normal versus what is questionable. Humans also adapt quickly to fast-changing conditions and can always explain why they did something. On the other hand, humans cannot scale and are prone to mistakes and inconsistencies. Machines, as we know, are exponentially faster and more consistent.


NEC claims new vector processor speeds data processing 50-fold

The company said its vector processor, called the Aurora Vector Engine, leverages “sparse matrix” data structures to accelerate processor performance in executing machine learning tasks. Vector-based computers are basically supercomputers built specifically to handle large scientific and engineering calculations. Cray built them in previous decades before shifting to x86 processors. The architecture fell out of favor as x86 closed the performance gap, but NEC has a series of supercomputers called SX that really up the ante. Each CPU in the new generation, SX-ACE, can crank out 256 gigaFLOPS of performance and address 1TB of memory, which is pretty powerful. NEC said it also developed middleware incorporating sparse matrix structures to simplify machine-learning tasks.
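Sparse-matrix structures pay off because they skip the zeros entirely. As a rough illustration only (not NEC's implementation), here is a minimal compressed sparse row (CSR) matrix-vector multiply in Java:

```java
import java.util.Arrays;

public class CsrDemo {
    public static void main(String[] args) {
        // Dense 3x4 matrix, mostly zeros:
        // [5 0 0 0]
        // [0 8 0 3]
        // [0 0 6 0]
        // CSR keeps only the nonzeros plus two index arrays.
        double[] values = {5, 8, 3, 6};   // nonzero entries, row by row
        int[] colIndex  = {0, 1, 3, 2};   // column of each nonzero
        int[] rowPtr    = {0, 1, 3, 4};   // row i's nonzeros span [rowPtr[i], rowPtr[i+1])

        double[] x = {1, 2, 3, 4};
        double[] y = new double[3];
        for (int i = 0; i < y.length; i++)
            for (int k = rowPtr[i]; k < rowPtr[i + 1]; k++)
                y[i] += values[k] * x[colIndex[k]];

        System.out.println(Arrays.toString(y)); // y = A * x  ->  [5.0, 28.0, 18.0]
    }
}
```

Only four multiplications run instead of twelve; on hardware built to stream such arrays, that gap is where the claimed speedups come from.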


How To Create An Effective Business Continuity Plan

Because restoring IT is critical for most companies, numerous disaster recovery solutions are available. You can rely on IT to implement those solutions. But what about the rest of your business functions? Your company's future depends on your people and processes. ... "There's an increase in consumer and regulatory expectations for security today," says Lorraine O'Donnell, global head of business continuity at Experian. "Organizations must understand the processes within the business and the impact of the loss of these processes over time. These losses can be financial, legal, reputational and regulatory. The risk of having an organization's 'license to operate' withdrawn by a regulator or having conditions applied (retrospectively or prospectively) can adversely affect market value and consumer confidence. Build your recovery strategy around the allowable downtime for these processes."


Amazon Alexa is so smart it's stupid

Today, Alexa skills are somewhat like obscure command line directives: “Alexa, ask the Magic 8-Ball if I’ll ever remember any of these skills.” Amazon has built intelligence into Alexa that makes it easy for me to use Amazon services (e.g., buy replacement air filters, play Audible books, etc.) but has left much of the skills territory to third-party developers. This would be awesome if, as mentioned, it were easier to uncover these skills. But wait, you say, there’s a website devoted to helping you find new and exciting Alexa skills. That’s correct. But to discover and then enable a new skill—Alexa skills nearly always require enablement and then a special set of voice commands to trigger them—you have to visit a website. It’s a voice interface that requires you to type into a desktop web interface. Kinda silly, don’t you think?


Who controls the marketing tech stack in 2017: The CIO or CMO?

In an earlier era, one simply had to go through the IT department to get the technology one needed that would actually work with the existing infrastructure, technology standards, and enterprise architecture. No longer. The cloud, and especially software-as-a-service (SaaS), has changed this equation forever. Every IT department is now faced with the most formidable possible day-to-day competitor: the combined services inventory of the entire SaaS industry, along with all the available mobile and enterprise app stores. These new sources of marketing IT collectively represent to the CMO -- as marketing technology tracker Scott Brinker has noted in his terrific industry analysis -- a genuine explosion of new options, going from a mere 150 business-ready marketing apps in 2011 to an astonishing 3,500-plus in 2016.


'Absolutely Necessary': How Blockchain Could Help Tech Giant Cisco Reboot

It turns out, not only is Cisco exploring how to distribute identity to simplify employee logins across more than 20 of the company's subsidiaries, but that Cisco's customers themselves may someday use the service to better audit the transactions of suppliers. According to Greenfield, many database standards still have difficulty recognizing that a subsidiary is actually part of a parent company, making it hard to track who conducted which transactions and under whose authority. "We wanted to create a blockchain ID use case that uses the different APIs across these different organizations, and internal applications to establish one identity for internal users," he said. "But also customers as well, where it’s going to be easier to perform analysis."


3 compliance considerations for containerized environments

Instead of going to an operations team to get an app up and running, developers often build and deploy it themselves. This means that many of the traditional workflows that organizations used to check for compliance before deploying new systems may no longer be in the loop. For example, in the past your operations team may have been responsible for ensuring PCI compliance before your retail app was updated. In a model in which the dev team can push that upgrade directly to production themselves, that manual check adds friction and delays to the process, if it happens at all. Rather than relying on manual interaction, organizations can benefit from tools that integrate directly with the workflow and stress efficiency and prevention, rather than manual tasks and reaction.


Painlessly Migrating to Java Jigsaw Modules - a Case Study

The feature you’ll hear most about in the context of Java 9 is Project Jigsaw, the introduction of modules to Java. There are lots of tutorials and articles on exactly what this is and how it works; this article will cover how you can migrate your existing code to use the new Java Platform Module System. Many developers are surprised to learn that they don’t have to add modularity to their own code in order to use Java 9. The encapsulation of internal APIs is probably one of the features that concerns developers when considering Java 9, but just because that part of Jigsaw may impact developers does not mean that developers need to fully embrace modularity in order to make use of Java 9. If you do wish to take advantage of the Java Platform Module System (JPMS), there are tools to help you, for example the jdeps dependency analyzer, the Java compiler and your own IDE.
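For a sense of what opting in looks like, a minimal module declaration is a single file at the root of the source tree (the module and package names here are hypothetical):

```java
// module-info.java -- declares the module, its dependencies, and its public surface
module com.example.inventory {
    requires java.sql;                 // depend on a platform module
    exports com.example.inventory.api; // only this package is readable by other modules
}
```

Before writing one, the jdeps tool can report which modules your existing jar actually depends on (for example, `jdeps -s myapp.jar` prints a dependency summary).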


The 5 Fundamentals Of Effective Cloud Management

“A big mistake that many companies make is that they treat public cloud services in particular as though they were a cable service, where you use it every month and pay a bill at the end of the month,” says Dennis Smith, a Gartner analyst who tracks the cloud management space. “Many find they’re spending more money than they did before [using their on-premises service]. Public cloud providers aren’t going to tell you there are more efficient ways of using their services. You need to manage it similar to the way you’d manage on-premises infrastructure.” CIOs need to learn to manage those cloud systems with regard to cost, capacity planning, security and other conditions. That need has spawned a modest but growing market for cloud management tools, which companies use to apply policy to, as well as automate and orchestrate across, public and private cloud services in a uniform way, according to Smith.


How to sell to the CIO

There is good news: IT sales teams who develop a proactive, personal approach to CIOs can get a permanent foot in the door. Yet there's no room for complacency once a contract is signed. Proactivity must also extend to ongoing account management, which can be a merry-go-round. CIOs suggest salespeople tend to move accounts regularly, often just as an IT leader has got used to a manager and the individual in question has begun to understand the demands of the CIO and his or her business. "The churn risk is huge," says interim CIO and consultant Toby Clarke, who adds that consistency will be rewarded. "The companies I've bought products from tend to have longevity in their account management team. It shows me that they have faith in the stuff they're selling because they're still working for the company."



Quote for the day:


"The greatest danger in times of turbulence is not the turbulence; it is to act with yesterday's logic." -- Peter Drucker


Daily Tech Digest - July 02, 2017

What makes identity-driven security the new age firewall

Using a proprietary algorithm, Microsoft Advanced Threat Analytics works round the clock, continually learning the behaviour of organizational entities such as users, devices, and resources, and helps customers adapt to the changing nature of cybersecurity attacks. In addition to this, the technology enhances threat and anomaly detection with the Microsoft Intelligent Security Graph, which is powered by enormous datasets and machine learning in the cloud. “Identity is the new firewall. If you are taking a traditional endpoint/device protection approach then you are short-changing your organization's goals. It is critical to understand that the perimeter of IT includes users, apps across cloud and on premise, and most importantly data. Identity is what can help secure this perimeter,” says Rajiv Sodhi.


Enabling IoT Ecosystems through Platform Interoperability

To enable interoperability for IoT platforms on the cloud, fog, or device level, the BIG IoT API offers a well-defined set of functionalities. Seven functionalities are crucial. The first is identity management to enable resource registration. The second is discovery of resources according to user-defined search criteria. The third is access to metadata and data (data pull as well as publish-and-subscribe for datastreams). The fourth is tasking to forward commands to things. The fifth is vocabulary management for semantic descriptions of concepts. The sixth is security management, including authentication, authorization, and key management. The seventh is charging, which allows the monetization of assets through billing and payment mechanisms.
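As a rough sketch only (the names below are illustrative, not the actual BIG IoT API), the seven functionality groups could be pictured as one platform interface, one method per group:

```java
import java.util.List;

interface IoTPlatform {
    String register(String resourceDescription);              // 1. identity management
    List<String> discover(String searchCriteria);             // 2. discovery
    String access(String resourceId);                         // 3. metadata/data access
    void task(String thingId, String command);                // 4. tasking
    String describe(String concept);                          // 5. vocabulary management
    boolean authorize(String credential, String resourceId);  // 6. security management
    void charge(String consumerId, String resourceId, long amountCents); // 7. charging
}

public class PlatformSketch {
    public static void main(String[] args) {
        // Reflectively confirm the sketch covers all seven groups from the article.
        System.out.println("functionality groups: " + IoTPlatform.class.getMethods().length);
    }
}
```

The point of such a surface is that a client written against it can move between cloud-, fog-, and device-level platforms without rewriting its integration code.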


Artificial Neural Networks (ANN) Introduction

We recognize images and objects instantly, even if these images are presented in a form that is different from what we have seen before. We do this with the 80 billion neurons in our brain working together to transmit information. This remarkable system of neurons is also the inspiration behind a widely used machine learning technique called Artificial Neural Networks (ANN). Some computers using this technique have even outperformed humans in recognizing images. ... An ANN model is trained by giving it 10,000 examples of handwritten digits, together with the correct digits they represent. This allows the ANN model to learn how the handwriting translates into actual digits. After the ANN model is trained, we can test how well the model performs by giving it 1,000 new handwritten digits without the correct answer.
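The training loop scales down to a single artificial neuron. The sketch below (a classic perceptron, not the digit-recognition network the article describes) trains one neuron on the logical AND function with the same supervised cycle: predict, compare with the correct answer, adjust the weights.

```java
public class PerceptronDemo {
    public static void main(String[] args) {
        int[][] inputs = {{0, 0}, {0, 1}, {1, 0}, {1, 1}};
        int[] labels   = {0, 0, 0, 1};      // correct answers for logical AND
        int w0 = 0, w1 = 0, bias = 0;       // integer weights keep the demo exact

        for (int epoch = 0; epoch < 20; epoch++) {
            for (int i = 0; i < inputs.length; i++) {
                int sum = w0 * inputs[i][0] + w1 * inputs[i][1] + bias;
                int predicted = sum > 0 ? 1 : 0;
                int error = labels[i] - predicted; // perceptron learning rule
                w0 += error * inputs[i][0];
                w1 += error * inputs[i][1];
                bias += error;
            }
        }
        for (int i = 0; i < inputs.length; i++) {
            int sum = w0 * inputs[i][0] + w1 * inputs[i][1] + bias;
            System.out.println(inputs[i][0] + " AND " + inputs[i][1] + " -> " + (sum > 0 ? 1 : 0));
        }
    }
}
```

A real digit recognizer chains many such units into layers and replaces the hard threshold with differentiable activations, but the feedback loop is the same idea.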


Alibaba: Building a retail ecosystem on data science, machine learning, and cloud

The war in retail went technological long ago. Amazon is the poster child of this transition, paving the way first by taking its business online, then embracing the cloud and offering ever more advanced services for compute and storage to third parties via Amazon Web Services (AWS). Amazon may be the undisputed leader both in terms of its market share in retail and its cloud offering, but that does not mean the competition just sits around watching. Alibaba, which some see as a Chinese counterpart of Amazon, is inspired by Amazon's success. However, its strategy both in retail and in cloud is diversified, with the two converging on one focal point: data science and machine learning (ML).


The Future is Imminent: 9 Design Trends for 2018

For those in the audience unfamiliar with the term, synesthesia refers to the perceptual condition of mixed sensation: a stimulus in one sensory modality (like hearing) involuntarily elicits a sensation or experience in another sensory modality (like smell). A person with synesthesia might hear a bird chirping and all of a sudden smell the scent of popcorn, or taste the flavor of mint, or feel the sensation of floating. ... A progress spectrum is a far more natural way of measuring “progress.” Instead of breaking up the user experience into unnatural, linear, paginated steps, a progress spectrum reflects the true experience of the user, one in which progress is experienced along a broad and continuous spectrum, where one event seamlessly flows into the next.


The Cyber-frauds

A mobile wallet works like an electronic prepaid card and can be used to pay for things ranging from grocery to rail tickets without the need to swipe the debit/credit card. All you have to do is to key in the username and password for logging in. The app can be loaded with money either through debit/credit card or net banking. The flip side is that these wallets mostly rely on the phone's locking system for security and don't ask for any PIN or password while the payment is being made. ... Rahul Gochhwal, co-founder of Trupay, says, "The biggest security issue is lack of second factor of authentication (password) while transacting. This makes them vulnerable to system-level breaches as transactions can be system generated by a hacker without a password. Thus, technically, a hacker can make thousands of fraudulent transactions simultaneously."


What every CIO Needs to Know About Cyber Resilience

Bohmayr & Türk, from the Boston Consulting Group, write that “cyber-resilience in an organization must extend beyond the technical IT domain to the domains of people, culture and processes. A company’s protective strategies and practices should apply to everything the company does — to every process on every level, across departments, units and borders, in order to foster an appropriately security-conscious culture.” ... The issue of board responsibility and oversight of cyber risks isn’t new. The Cybersecurity Disclosure Act of 2015 was introduced in the US Senate that year. The bill would have required “public companies to disclose whether any board member has experience or expertise in cybersecurity, and to describe the nature of that background” and, should no board director have cybersecurity expertise, to justify why such expertise was unnecessary.


Security in a silo – breaking down the barrier between CISOs & C-Suite

If you’ve been in the security industry for any length of time, and as a CISO I assume you have been, you’ve probably already seen and read such articles… but if you are still reading this, it probably means that you don’t feel like this is yet a reality within your organisation. So, with the shared understanding that this shift is past due, we can start to talk about building your strategy to make it happen. Before doing so, however, we need to acknowledge a sobering truth: people don’t care about security for the sake of security alone. What they care about is the result that a sound security strategy can provide and the impacts/risks associated with the lack of one. We’ll use this understanding to inform the methods that we use to engage the organisation and our board.


The Hard-Dollar Benefits of GRC Consolidation

The intention of implementing a single platform architecture must come from an IT vision for rationalizing applications in use across the company. Instead of supporting potentially hundreds of applications that each do only one thing, the GRC infrastructure, in time, should comprise one cohesive platform that supports many functions. GRC applications tend to span activities and departments within organizations. Groups across the company often independently manage activities such as risk assessments, audits, controls testing and third-party assessments. To do this, they make use of many individual solutions – some of which do the same thing, just in different organizational silos. For secure business management, this colossal set of single-use applications has to be visible, managed, supported and maintained. This is something that is both costly and time inefficient.


The Computest Story: The Transformation to an Agile Enterprise

Inspired by Henrik Kniberg & Anders Ivarsson's famous article on how Spotify scaled their development organization we decided to put multidisciplinary teams in the center, supported by a group of people outside the teams focusing on coaching and fulfilling company-wide responsibilities. As Figure 2 indicates, the major difference in the first transformation step was to integrate as many central functions in interdisciplinary teams as possible, to structure them by industries and to differentiate the leadership group in 'captains' and 'coaches'. Whereas the captains took over social leadership for the teams as well as responsibility for resource and account management, the coaches formed a group of thought leaders with a broad variety of subject matter expertise, responsible for both policies and solutions.



Quote for the day:


"Thinking is the hardest work there is, which is probably the reason so few engage in it." -- Henry Ford


February 20, 2016

Technical Architects With Dirty Hands

The really good architects I knew received genuine respect, authority, and often affection within the team. You could not imagine operating without their sage advice, and they were ready to pitch in when an especially nasty problem emerged. When I first met some IT architects, who lived outside the development teams, my first thought was, “Why would you ever isolate yourself that way?” The weaknesses of the architect external to the team are hardly surprising. Rather than earning respect, teams look at these outsiders with suspicion. Rather than wielding authority, architects often struggle to find leverage with teams. And affection? In many organizations, that’s a rare achievement for architects.


Stubbing, Mocking and Service Virtualization Differences

Service virtualization is a technique for decoupling a test suite from environmental dependencies that is growing in popularity. It is part of the broader category of "test doubles" that include mocks and stubs. There are many tools that support either service virtualization or two very similar approaches: stubbing and mocking. This article offers a comparison of the three approaches, their relative strengths and explores a number of real world situations to recommend which technique may offer the most benefit. The intent is to provide an understanding of all three techniques that enables to you to choose the best option between a mock, stub, or virtual service depending upon the specifics of your situation.
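The distinction is easiest to see in code. In this minimal hand-rolled sketch (no framework; the interface is hypothetical), a stub feeds canned answers into the code under test so the test can check resulting state, while a mock records the interaction so the test can verify behavior:

```java
interface PaymentGateway {
    boolean charge(String account, int cents);
}

// Stub: returns a canned answer; the test then asserts on resulting state.
class AlwaysApprovesStub implements PaymentGateway {
    public boolean charge(String account, int cents) { return true; }
}

// Mock: records how it was called; the test then asserts on the interaction.
class RecordingMock implements PaymentGateway {
    int calls = 0;
    String lastAccount;
    public boolean charge(String account, int cents) {
        calls++;
        lastAccount = account;
        return true;
    }
}

public class TestDoublesDemo {
    static boolean checkout(PaymentGateway gateway) {
        return gateway.charge("acct-42", 1999); // code under test
    }

    public static void main(String[] args) {
        System.out.println("stub says: " + checkout(new AlwaysApprovesStub()));
        RecordingMock mock = new RecordingMock();
        checkout(mock);
        System.out.println("mock saw " + mock.calls + " call to " + mock.lastAccount);
    }
}
```

A virtual service sits one level further out: instead of an in-process object, it is a standalone process that speaks the real wire protocol and can be shared across teams and test environments.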


There Are No Pit Stops in Enterprise Transformations, Really?

All of this sounds easy to write about, but when faced with the challenge of turning your enterprise around, you need the right mindset, culture and tools, along with talented people who understand the nuances of the change and how to go about it. ... Unless people perceive it this way, change is difficult; it cannot simply be forced down their throats. Fluidity in all process and people boundaries means that people can reach out and interact with folks across their immediate process lines. This needs a mindset change, which can happen when there is strong management support for moving away from silos and encouraging decentralization. ... The usual mindset is to look at time, money, resources and progress once you have finished your product or solution.


Intel Compute Stick (2016) Review

The hardware upgrades are the less noticeable changes, though they extend beyond just a processor change from last year’s Bay Trail Z3735F chip to a Cherry Trail Atom x5-Z8300 running at 1.44GHz. The combo Wi-Fi/Bluetooth wireless card is now an Intel dual-band Wireless-AC 7265 instead of last year’s Realtek RTL8723BS, which adds support for 802.11ac and Bluetooth 4.2. ... Externally, the Cherry Trail Compute Stick has gotten the much-needed addition of a USB 3.0 port, which complements the microSD slot and USB 2.0 port that are carried over from last year. This upgrade alone makes this year’s Compute Stick a vast improvement over last year’s; as you’ll see below, the performance differences between the Bay Trail and Cherry Trail processors are so small you won’t feel them much.


The Next Big Questions in Management

So if we want to know where management thinking is going, perhaps it is most useful to look for the questions that people are not quite yet asking. Some of them may be surfacing now, seemingly out of nowhere. They may seem impractical or irrelevant. They may come from out of left field (or distant center or right field, for that matter). But sooner or later, they will likely strike us as the self-evident questions that we should have been asking all along. As editor-in-chief of strategy+business since 2005, it has been my job — and pleasure — to forage for those kinds of inquiries. Here are four significant questions I think may preoccupy the person (or computer) who holds this post 20 years hence:


Will The Real Enterprise Private Cloud Please Stand Up

Survey after survey has reaffirmed that IT security continues to weigh heavily on the minds of global CIOs. As just one recent example, cybersecurity was cited as a top priority for the fourth straight year by respondents of the CSC Global CIO Survey. With innovation and agility as key CIO objectives, it should come as no surprise then that 80% of respondents to the 2014-2015 edition of that survey reported moderate-to-heavy investment in private cloud. All too often, however, private cloud is implemented as a thin layer of automation on top of traditional virtualization technologies and IT practices. And all too often, these projects struggle.


Ramco: IT's First-Aid for the Aviation Industry

For years, the aviation industry has understood that IT offers a way of optimizing operations. Yet many airlines still suffer from the use of disparate point solutions, each generating a repository of redundant data, which often has no ‘meaning’. Surprisingly, some operators even handle a few jobs manually and manage vast maintenance data in hundreds of Excel spreadsheets! But with the kind of cut-throat competition in the aviation industry today, using an Excel sheet as a process optimization and productivity improvement tool will send you back to the Stone Age. Tracking maintenance offline, managing safety and regulatory compliance manually using paperwork or spreadsheets, and using disparate point solutions are all tools of the past century.


HSBC banks on biometrics with new voice and touch services

According to the bank, the new services will be available to up to 15 million customers, who will no longer have to go to the trouble of remembering passwords and answers to random security questions. As the sheer number of passwords that have to be remembered these days is one of the main issues for consumers, this is certainly a positive step forward. Francesca McDonagh, HSBC UK’s head of retail banking and wealth management, said: “The launch of voice and touch ID makes it even quicker and easier for customers to access their bank account, using the most secure form of password technology – the body” and described the move as “the largest planned rollout of voice biometric security technology in the UK.”


Diagnosing Common Database Performance Hotspots in our Java Code

All too often, developers don’t feel the need to optimize pool size as they typically don’t do the requisite large scale load testing, nor do they know how many users will be expected to use that new feature, or what ramifications it implies, for parallel DB Access. Or perhaps the pool configuration “got lost” along the way from pre-prod to production deployment and then defaulted back to whatever the default is for your app server. Connection pool utilization can easily be monitored through JMX metrics. Every application server (Tomcat, JBoss, WebSphere, …) exposes these metrics, although some require you to explicitly enable this feature. The following shows the pool utilization of four WebLogic servers running in a cluster.
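The JMX pattern itself is small. The sketch below queries the JVM's built-in platform MBean server; for an app-server connection pool you would substitute the vendor-specific ObjectName and attributes (for example, Tomcat publishes DataSource MBeans under the Catalina domain), which is an assumption that varies by server:

```java
import java.lang.management.ManagementFactory;
import javax.management.MBeanServer;
import javax.management.ObjectName;

public class JmxPoolCheck {
    public static void main(String[] args) throws Exception {
        MBeanServer server = ManagementFactory.getPlatformMBeanServer();

        // For a real pool you would look up the vendor-specific name, e.g.
        // new ObjectName("Catalina:type=DataSource,...") on Tomcat, and read
        // pool attributes such as active and maximum connection counts.
        // Here we read a built-in MBean so the snippet runs anywhere.
        ObjectName threading = new ObjectName("java.lang:type=Threading");
        Object live = server.getAttribute(threading, "ThreadCount");
        System.out.println("registered=" + server.isRegistered(threading)
                + " liveThreads>0=" + (((Integer) live) > 0));
    }
}
```

The same getAttribute call, pointed at the pool's MBean and polled on a schedule, is enough to chart utilization like the cluster view described above.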


Are You a Transformational CIO?

The transformational IT organization, on the other hand, is very much in demand, and there are very few IT leaders that understand it. There also are very few business folks, that is, folks outside of IT, who have that level of expectation of IT: that transformational expectation, that business-centric expectation. And the best way to think about a transformational IT org is as a business organization first that happens to have responsibility for IT. So, a CIO that is very business-oriented as opposed to tech-oriented, who also happens to have responsibility for IT. Back to your fundamental question, I would argue that the traditional, not the transformational, but the traditional CIO and the traditional IT organization is very much in decline. The transformational IT organization and the transformational CIO is very much in demand, and that trend is something that we are just starting to scratch the surface on.




Quote for the day:

"Leadership requires the courage to make decisions that will benefit the next generation." -- Alan Autry


September 15, 2015

Enterprise data architecture strategy and the big data lake

Data virtualization's use of defined semantic models to represent a converged view of original sources addresses both of the issues with accessing data in a data lake. Federating access to data in a data lake eliminates the need for users to rewrite their applications to include code to read the data from the data lake, reducing the need for data replication. Existing applications can target the semantic model, making the source of the data transparent to the consuming application. At the same time, data virtualization hides the complexity of schema-on-read by allowing each user to apply specific data normalization and transformation rules to the data to produce the "renderings" that are suited for each application use.


The new art of war: How trolls, hackers and spies are rewriting the rules of conflict

To put it another way: cyberwarfare models are maturing in the same way that other technologies mature. To take a more prosaic example, the evolution of cyberwarfare is a lot like the cycle e-commerce went through. There was a lot of initial excitement and investment from retailers in building separate e-commerce operations or businesses, but gradually these became not just a standard part of their operation but for many retailers the core of their business, just as cyberwarfare planning and strategy is gradually becoming a part of mainstream military planning. However that doesn't mean that all countries are taking the same approach to strategy or that they even agree on what should be included in the term cyberwarfare.


First Detailed Public Map of U.S. Internet Backbone Could Make It Stronger

Knowing the exact location of the most important Internet cables should help efforts to understand the possible effects of natural disasters or intentional attacks on the Internet, for example. Barford says he is also talking with researchers and people at telecommunications companies about the idea of adding extra fiber links that would be shared by different companies. They’d be located at key points where new fiber between major population centers could significantly improve the resilience and efficiency of the Internet. Although the Internet is publicly accessible, it is woven together from many privately owned networks that interoperate. Telecommunications companies sometimes show schematics of their core networks, but without much geographic detail.


How new data-collection technology might change office culture

The obvious fear for many employees is that the data collected would not be anonymous and could instead be used for hiring, firing and promotion decisions. The growing market for these types of tools is sure to spawn imitators who might not uphold the same privacy safeguards. Privacy advocates shuddered when a software developer recently boasted that it would be possible for employers to peek into the emails and messages sent through Microsoft's Lync messaging system. "You can become your own mini-NSA," David Tucker, CEO of Australia-based Event Zero, told Network World. Managers could see which employees are dating and which ones are seeking out their next job. "Just make sure it doesn't end up on WikiLeaks," he advised.


Case study: Philips takes agile approach to building bridges between business and IT

“Over a longer period, it’s easier to miss a few edges. The financial impact is also much greater as you need a lot of management to keep everything on track in a six- to nine-month project,” says van Zoelen. “The amount of code we throw away is limited so we save money. I would almost say everything we do now is focused on delivering the most value possible.” For this reason, and since throwing its weight behind agile in 2011, the company claims to have made savings in the region of €47m as project lead times have fallen from 54 business days to 20. Over this same period of time, the number of teams involved has also grown from seven to 120. Within the teams are high levels of engagement and – because everyone is clear about what they should be doing – the working environment is largely positive, says van Zoelen.


RoboEthics – We Need Universal Robot Rights, Ethics And Legislation

Is it OK to torture or murder a robot? We form such strong emotional bonds with machines that many people can't bring themselves to be cruel to them, even though they know the machines are not alive. So should robots have rights? Mistreating certain kinds of robots could soon become unacceptable in the eyes of society. And what would it take to make you think twice before being cruel to a machine? ... There is an emerging technology called quantitative legal prediction. It turns out that experienced lawyers often add a lot of value by making predictions. Using big data and complex analytics, machines may become best at "predicting" whether you're going to win a case, or whether a ruling will be overturned on appeal, for example.


Can training transform CISOs into business leaders?

“If you look at other C-suite roles – CEO, CFO, CMO – these have been established for decades, creating defined paths to success. The CISO has been around for roughly 10 to 15 years, but it didn’t come to prominence until the last few years, and then as a technical role.” And technical skills, he added, while key to the “functional” success of a CISO, “do not lend themselves well to the business acumen and communication skills needed to work with your typical C-suite today. The main shift needed is towards thinking in terms of risk, not technology, and how this risk relates to various aspects of the business.” Christiansen agrees, to the point that he said the job is getting a different title. “The role of the CISO is evolving to the chief information risk officer (CIRO),” he said.


Don't underestimate the network's importance in manufacturing analytics and IoT

The practice on manufacturing floors was to leave choices about networking topology and machine-to-machine (M2M) interconnections to vendors, but as this dialogue moves into internal ERP and other higher-level office systems that support analytics and dashboards, corporate IT will be involved. There are two flavors of Internet of Things (IoT) communications in manufacturing environments: an IP-based network that is hard-wired and that interconnects machines on the floor with the ability to move information to the internet; and a more localized communications scheme where devices in immediate proximity to each other communicate through wireless technology like Bluetooth or over wired Ethernet.
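The two-tier architecture described above can be illustrated with a small, purely hypothetical Python sketch (the class and method names are mine, not from the article): proximity-scoped devices speak a short-range protocol to a gateway, and the gateway bridges their readings onto the plant's hard-wired IP network, where ERP and analytics systems can consume them.

```python
class LocalDevice:
    """A sensor that only speaks a short-range protocol (e.g. Bluetooth)."""
    def __init__(self, device_id, reading):
        self.device_id = device_id
        self.reading = reading

class Gateway:
    """Bridges proximity-scoped devices onto the plant's IP network."""
    def __init__(self):
        self.local_devices = []
        self.uplink_buffer = []  # payloads bound for higher-level systems

    def pair(self, device):
        # Flavor 2: localized, non-IP M2M link between nearby devices.
        self.local_devices.append(device)

    def poll_and_forward(self):
        for dev in self.local_devices:
            # Flavor 1: wrap each reading in an IP-friendly payload for the
            # hard-wired network that moves data toward ERP/analytics.
            self.uplink_buffer.append(
                {"source": dev.device_id, "value": dev.reading}
            )
        return self.uplink_buffer

gw = Gateway()
gw.pair(LocalDevice("press-01", 72.4))
gw.pair(LocalDevice("press-02", 68.9))
payloads = gw.poll_and_forward()
print(payloads)  # two dicts, ready for an ERP/analytics ingest endpoint
```

The gateway is the point where corporate IT's concerns begin: everything below it has traditionally been the vendor's choice of topology, while everything above it must fit the company's network and security policies.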



Why Agile Didn’t Work

At the top is the lofty goal of "satisfying customers by satisfying their constantly changing requirements". We achieve this goal by "delivering working software frequently". Delivering working software, though, requires significant technical and managerial support. Ensuring that changing requirements do not break the system and slow down development is foremost a technical issue: how to design the system so that it is flexible, and how to create automation that ensures changes do not break things. To foster advanced technical skills, teams have to be motivated to learn from their mistakes and to develop themselves.



Shadow IT risks heightened in hybrid cloud

Shadow IT risks are heightened when combined with hybrid cloud. Most companies have data security and compliance practices to protect not only their own information, but that of their customers and suppliers. These practices and policies assume that data is contained within a controlled environment. But if users create a hybrid cloud workflow that connects shadow IT software as a service (SaaS) applications to highly structured applications, they can violate security and governance requirements – a risk known as bandit hybridization. The dangers of bandit hybridization are growing for two reasons. First, SaaS adoption is increasing, and line departments can easily adopt SaaS applications without IT support.



Quote for the day:

"Nothing so conclusively proves a man's ability to lead others as what he does from day to day to lead himself." -- Thomas J. Watson