November 19, 2014

The Resurrection of Product Risk Analysis
The risk management described here aims to identify risks connected with the development and implementation of an information system. Risk is the probability that development and implementation will cause measurable economic damage to the company, and perhaps other, less measurable damage as well. Damage could mean that the results of a project are less favourable than expected, or that the organisation suffers direct or indirect loss. Risks extend beyond IT and concern the business, too.


The Web Isn't Dying, But Control Is
Apple exerts too much control over content in iOS apps. If Apple wishes to continue censoring apps based on content, it should license third-party iOS stores or at least adopt Google Play's less fussy content policy. There's no reason native apps should be treated any differently from books or films when it comes to lawful content, and no reason Apple should be able to reject an app whose content is lawful. With web apps, this isn't an issue; no permission is required to publish a web app. The issue isn't so much that the web is dying; it's that too many people prefer autocratic convenience over the web's messy democracy.


When will this madness stop?
Designing, building and launching digital products and services usually involves multiple areas of the organisation working together to create something new and innovative. Being a digital business means being a joined-up business. Digital does not stop at functional boundaries; it flows through the organisation to create integrated offerings and a seamless customer experience. A business with silos, whether organisational, data, systems or any other type, will struggle to survive in the digital age. Just because marketing is spending an increasing amount on technology does not mean there is a need for a marketing technology strategy.


IT needs to stop pretending it's not responsible for cloud security
So whole IT departments will use public cloud for their own work, but refuse to update perimeter security or network monitoring enough to let them see web apps, let alone encrypt that traffic and properly secure those apps? Who is supposed to do that, if not IT? Seventy-nine percent of IT people polled by Forrester in May of 2014 said end users should be primarily responsible for securing data in the cloud. That doesn't mean IT thinks users are responsible; no one in IT thinks users are responsible.


Cybercrime and spam are far bigger security threats than you think
"There are very few types of cybercrime that exist in a vacuum. Most forms of cybercrime are in some way connected to others. For example, nobody runs a botnet or robs bank accounts without taking steps to hide their true internet address. Usually to do that they are using hacked computers to route their traffic through, they are probably using hacked servers to store the stolen data, and then they are using money laundering networks to cash out transactions."


Keys To Collaborating Over A Business Network
The real potential for transformation comes from the ability of a business network to enable trading partner collaboration not just for invoice processing, but also for management of related documents such as catalogs, contracts, purchase orders, order confirmations, change orders, service entry sheets, freight line items, advance ship notices, payment status, and payment remittance. This means that, from one platform, in the cloud, you can streamline essential collaborative business processes from procurement through payment. At the same time, you can improve compliance by driving more orders off catalogs and simplifying the matching of invoices to purchase orders, contracts, and service entry sheets.


7 Important Tech Regulatory Issues In 2015
The Internet is now a central engine of society and must allow for continued innovation and development, Robert Atkinson, president of the Information Technology and Innovation Foundation (ITIF), told InformationWeek. "To this end, net neutrality rules should be tailored to allow for a case-by-case approach to prioritization, and network management that allows for some subtlety and nuance in regulation. This is to be preferred to an over-broad, proscriptive rule grounded in Title II regulation for the telephone era that would likely limit the Internet's potential to become the multi-purpose platform it promises to be," he said.


Overcoming Hurdles to Integrating Analytics with Operational Processes
Acceptable speed-of-response rates differ between operational and analytical use cases. Often operational processes require some real-time processing. Think of going to the grocery store or ordering a product or service: you expect this particular process to take place in real time. When you use an online or mobile application, your tolerance for slow response actually goes down. With the ability to change providers/applications quickly, it is important to match the expected speed of response with the performance of the application. This means that if you integrate analytics directly into operational processes, the speed of analytical response needs to match the real-time nature of most operational processes.


A Preview of C# 6
Mads Torgersen, C# program manager at Microsoft, published a short video presentation describing what is coming in the next major C# version, C# 6. Among the new C# 6 features, Mads highlighted getter-only properties, the lambda-arrow operator, string interpolation, and more. First off, says Mads, C# 6 will not change the C# design philosophy; it will mostly provide many small features that help clean up code.


Time for Data-Driven Intuition
The book The House Advantage: Playing the Odds to Win Big In Business (Jeffrey Ma) should be required reading for anyone working in the data management and business intelligence fields, where we often oversimplify the business decision-making process by saying it’s either data-driven or intuition-driven, and strongly emphasize that using data is always better than using intuition. Although Ma is definitely an advocate for data-driven decision making, toward the end of his book he also acknowledges that there are times when a middle ground between data and intuition is called for.



Quote for the day:

"Consider spending more time setting the conditions for things to go right than dealing with things that go wrong." -- @ShawnUpChurch

November 18, 2014

CIO interview: Anna Barsby, CIO, Halfords
“There’s complexity running a programme and upgrading SAP, which is pretty much at the heart of our system estate,” she says. “And it was our first move into the cloud at the same time,” she adds. “Going into the cloud with anything has its unknowns, but with SAP as our first foray it just felt risky.” “Culture was a really big one for us,” she says. “Once we had agreed on HP, we decided we also wanted to move from a physical server to the cloud.” Barsby says that while the move to the cloud increased risk, the retailer needed only one period of downtime to complete the upgrade.


Testing Strategies in a Microservice Architecture
There has been a shift in service based architectures over the last few years towards smaller, more focussed "micro" services. There are many benefits with this approach such as the ability to independently deploy, scale and maintain each component and parallelize development across multiple teams. However, once these additional network partitions have been introduced, the testing strategies that applied for monolithic in process applications need to be reconsidered. Here, we discuss a number of approaches for managing the additional testing complexity of multiple independently deployable components, as well as how to keep tests and the application correct when multiple teams act as guardians for different services.
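One widely discussed strategy in this space is consumer-driven contract testing, where each consumer declares just the fields it depends on and the provider's responses are validated against those declarations rather than against the provider's full schema. A minimal Python sketch of the idea (all names are invented; no specific framework is implied):

```python
# Minimal consumer-driven contract check: the consumer declares the fields
# (and types) it relies on; a provider response is validated against that
# contract, not against the provider's full schema.

CONSUMER_CONTRACT = {"id": int, "name": str}  # what this consumer needs

def satisfies_contract(response: dict, contract: dict) -> bool:
    """True if the response contains every contracted field with the right type."""
    return all(
        field in response and isinstance(response[field], expected_type)
        for field, expected_type in contract.items()
    )

# The provider may return extra fields; the contract simply ignores them.
provider_response = {"id": 42, "name": "widget", "internal_sku": "X-9"}
assert satisfies_contract(provider_response, CONSUMER_CONTRACT)

# A provider change that drops a contracted field breaks this consumer's test,
# giving the provider team early warning before deployment.
broken_response = {"id": 42, "internal_sku": "X-9"}
assert not satisfies_contract(broken_response, CONSUMER_CONTRACT)
```

Because each consuming team owns its own contract, a provider can evolve freely as long as every published contract still passes.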


Mega Data Breaches: Are They Here to Stay?
Current security solutions either do not have the capabilities to aggregate, analyze and correlate information from multiple sources, or cannot scale and handle the volume of data generated by the activities over a period of time. The greatest area of unmet need with conventional security solutions is effective, targeted attack prevention and breach detection. Organizations are failing at early breach detection, with more than 92% of breaches detected and notified by a third party—this is what ultimately impacts the size and cost of the data breach.


Cloud computing's not-so-secret mission
As the cloud matures, we are seeing another layer of cloud computing that promises to shake the foundation of our IT infrastructure to its core – the advent of IT-as-a-Service, which will be perhaps the cloud’s highest calling. Initially, many thought of the cloud as the successor to the web host. The next-gen data center. As someone who first became involved in web hosting in 1995 or so, I will admit that I thought that as well. The cloud is a great place to keep your web infrastructure, and it is even great to keep your apps and app infrastructure. However, the cloud is also a great place to move your entire IT infrastructure. It took a little longer than moving websites or even apps, but IT in the cloud has arrived.


9 Healthcare Innovations Driven By Open Data
Vinod Khosla, a leading tech venture capitalist and the former CEO of Sun Microsystems, sees the change as inevitable. He described his vision in a keynote at this past June's Health Datapalooza, an annual celebration of new developments in data-driven healthcare. Khosla predicted that "data science will do more for medicine than all the biological sciences combined" over the next two decades. One driver, he believes, will be the need to reduce medical errors by using computers for more accurate case monitoring than humans can accomplish. These new advances are made possible by two related categories of data: big data and open data.


Five winning strategies of successful CIOs
Whether CIOs are being asked to deliver or transform, Marks says they will always have to consider a digital element. Data centres, he says, are being transformed, while mobility has become crucial and software is being delivered as a service by default. "The new digital value lies in the CIO's ability to match the best combination of technologies and to negotiate the right deal for all parties,” says Marks. “Whether the CIOs of today have the experience, skills, and motivation to achieve this combination is a different matter. This is perhaps the more daunting challenge for the CIO than the march, and possibly passing trend, of the chief digital officer.”


Determining data value to reduce cloud storage risks
The value of data comes down to its utility, which must be evaluated both for the content's value in the present and for the potential value of that same content in the future. A useful analogy might be to consider an old photograph taken of a subject in his younger days and showing him wearing the styles of that era. At the time the picture was taken, the image provided no offense to the subject. However, the same picture many years later might cause the subject to cringe at the fashion it displays. Now consider that instead of an old, funny picture, business or personal data is on display.


CIO success is all about winning friends and influencing people
The general consensus is that these pillars of technology are last year’s news, because CIOs today should be thinking about the concepts and technologies that sound a bit left field – such as how 3D printing and the internet of things (IoT) could influence the organisations they work in. One of the keynote sessions at this year's Gartner Symposium in Barcelona was a "fire-side" chat with Oliver Bussman, CIO of UBS. During the interview, Bussman was asked about the challenges facing the banking sector. "Digital disruption has arrived in banking," he said.


More users will hire criminals to fight cyber crime
The idea of using the skills of people who were once on the wrong side of the law is one that is taking hold in a rising number of companies, according to findings from KPMG. The firm found that over half of UK firms would consider hiring a hacker or someone with a criminal record in order to improve their own defences and stay ahead of the criminals. The reason many would recruit former criminals is that an overwhelming majority (74%) recognise there is a growing cyber threat, and 57% are struggling to get hold of specialised staff and then keep them.


Cisco hands over security analytics framework to open source development
Announced in a blog post on Monday, the San Jose, CA-based company said OpenSOC, a framework that uses big data analytics to detect threats, is now available for businesses to integrate within their own systems. ... The OpenSOC framework integrates elements of the Hadoop ecosystem, including Storm, Kafka, and Elasticsearch. According to the firm, this means OpenSOC is capable of full-packet capture indexing, storage, data enrichment, stream processing, batch processing, real-time search, and telemetry aggregation, and also provides a platform that can "effectively enable security analysts to rapidly detect and respond to advanced security threats."



Quote for the day:

"Too many people overvalue what they are not and undervalue what they are." -- Malcolm Forbes

November 17, 2014

13 Things to Do When a Hacker Steals Company Data
If the worst happens--e.g., a hacker steals your customer records or breaks into a server--it's easy to go into a tailspin and try to solve every problem all at once. Apart from the headaches this can cause, it's also not the best approach to a data breach. Orlando Scott-Cowley, the director of technology marketing at Mimecast, a company that makes a secure cloud-based e-mail service, told me about an action plan he advises.


HP Analytics blazes new trails in examining business trends from myriad data
There are 20 million SMBs in the US, and we were able to build a model to predict which of these prospects are similar to the clusters we had. That’s where we were able to find customers that looked like our most profitable customers, which we ended up calling Vanguards. That resulted in a tremendous dollar increment for HP. It's a good example of what you talked about: finding unexpected things. We just wanted to analyze data. It led us on a journey, and we ended up finding a customer group we weren't even aware of. Then we could build a marketing strategy to target those customers and get some value out of it.


James Lewis on Microservices
Johannes Thönes talks to James Lewis, principal consultant at ThoughtWorks, about microservices. They discuss microservices’ recent popularity, architectural styles, deployment, size, technical decisions, and consumer-driven contracts. They also compare microservices to service-oriented architecture and wrap up the episode by talking about key figures in the microservice community and standing on the shoulders of giants.


ArchiMate 2.1® Poster Pack - Print Version
The ArchiMate meta-model and notation are fast becoming the de facto standard for depicting Enterprise Architecture. The ArchiMate® 2.1 Poster Pack provides a quick-glance reference to both ArchiMate Concepts and ArchiMate Viewpoints.


A Primer on Measuring Employee Engagement
There are many factors that contribute to employee engagement — ranging from corporate culture to management style to competing priorities outside of work — and the pertinent factors are different for each employee. This complexity is what makes it so challenging to measure and understand engagement in an actionable way. While still in its infancy, people analytics is beginning to give organizations the data and tools to understand what drives engagement, perhaps even better than employees understand themselves.


Fitbit Data Now Being Used In The Courtroom
The lawyers aren’t using Fitbit’s data directly, but pumping it through analytics platform Vivametrica, which uses public research to compare a person’s activity data with that of the general population. Muller says the case is “unique,” and does appear to be the first known case where data from a wearable is used in court. (If other earlier cases come to light I will update this post.) “Till now we’ve always had to rely on clinical interpretation,” Muller says from his office in Calgary. “Now we’re looking at longer periods of time through the course of a day, and we have hard data.” His plaintiff will share her Fitbit data with Vivametrica for several months as part of an assessment period.
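Vivametrica's methodology isn't public; as a rough illustration, comparing one person's activity with population data might amount to computing a percentile rank. A hypothetical Python sketch with invented step counts:

```python
from bisect import bisect_left

def percentile_rank(value: float, population: list) -> float:
    """Percentage of the population sample that falls below the given value."""
    ordered = sorted(population)
    return 100.0 * bisect_left(ordered, value) / len(ordered)

# Hypothetical daily step counts for a small population sample and one claimant.
population_steps = [3000, 4500, 5200, 6100, 7000, 8400, 9900, 11200]
claimant_steps = 5000

rank = percentile_rank(claimant_steps, population_steps)
# rank == 25.0: the claimant is more active than 25% of this sample,
# the kind of "hard data" comparison the article describes.
```

A real assessment would of course control for age, occupation, and health status rather than compare against a raw sample.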


Are Asean CFOs starting to embrace the Cloud? Oracle asks
Despite the apparent advantages of Cloud computing, not all organisations are convinced that this is the best way forward, as many CFOs still have reservations about the quality of software vendors and the possible creation of processing silos. In particular, the migration of ERP applications onto the Cloud is facilitating one of the biggest shifts in financial systems. The challenge of integrating systems and technologies remains a key barrier to adoption at many organisations, as does the question of whether there are sufficient internal skills to make the shift.


Laser-Radio Links Upgrade the Internet
Technology that uses parallel radio and laser links to move data through the air at high speeds, in wireless hops of up to 10 kilometers at a time, is in trials with three of the largest U.S. Internet carriers. It is also being rolled out by one telecommunications provider in Mexico, and is helping build out the Internet infrastructure of Nigeria, a country that was connected to a new high-capacity submarine cable from Europe last year. AOptix, the company behind the technology, pitches it as a cheaper and more practical alternative to laying new fiber optic cables. Efforts to dig trenches to install fiber in urban areas face significant bureaucratic and physical challenges.


Data science: 'Machines do analytics. Humans do analysis'
Humans have to find the patterns, ask the right questions and make the connections in the data. "Machines do analytics," explained Sullivan. "Humans do analysis." Computers are good at detail and examining the past, but real data science requires imagination and cognitive ability. "I can take 10 tools, U.S. Census data and agriculture data and determine that people who were strangled by their bed sheets tracks cheese consumption," Sullivan said. "A human knows that makes no sense. You can't commoditize reasoning by a human." Another way to put it is that machines are used as "data janitors" to clean data and crunch numbers, but it's a small part of the overall process.


As open source goes mainstream, institutions collaborate differently
"There's a clear progression that nearly every government agency goes through, fromconsuming open source, to publishing open source (as a one-way broadcast), to collaboratingon open source," said Balter. "A similar progression is also seen from open source, to open data, and open government policy. Policymakers see the geek's tooling, realize the value of collaboration, and want to bring it into their own workflow. If your doctor takes a multivitamin every day, wouldn't you? To me, the idea of working more openly, regardless of format or form, within an organization, or with the public is the idea that we're seeing catch on. It's starting with open source, but that's just the start."



Quote for the day:

"Sometimes when you innovate,you make mistakes. It is best to admit them quickly,and get on with improving your other innovations." -- Steve Jobs

November 16, 2014

How to Become a Data Scientist in 8 Easy Steps
Our friends over at DataCamp just came out with a cool new infographic entitled “Become a Data Scientist in 8 easy steps.” This hits home for a lot of people who are trying to enter this new industry, hoping to fill some of its many open positions. The question is how best to make this transition. The useful infographic below will help answer this question by outlining the process of becoming a data scientist ... These are all excellent tips, so examine the infographic carefully for more detail. You too can become part of the “sexiest job of the 21st Century!”


Search for Growth in Social, Mobile Fuels Tech M&A Boom
“Now it’s disruptive technology that’s in the crosshairs,” Liu said. “Consolidation involves corporations needing to catch up in a way that they are not able to do fast enough organically.” The aggregate global value of all publicly disclosed-value deals set a new post-dotcom-era quarterly high of US$73.7 billion, up 41 percent sequentially and 4 percent year over year. At 923 deals in total, overall volume also set a record for any quarter since 2000, rising 6 percent sequentially and 31 percent year over year. Corporations, as opposed to private equity deals, continue to drive the growth, increasing aggregate value 40 percent sequentially and 9 percent year over year to $65.3 billion.


IoT Won’t Work Without Artificial Intelligence
The big problem will be finding ways to analyze the deluge of performance data and information that all these devices create. If you’ve ever tried to find insight in terabytes of machine data, you know how hard this can be. It’s simply impossible for humans to review and understand all of this data – and doing so with traditional methods, even if you cut down the sample size, simply takes too much time. We need to improve the speed and accuracy of big data analysis in order for IoT to live up to its promise.
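To make the scale problem concrete: even the simplest automated triage, such as flagging readings that deviate strongly from the mean, is work machines must do because no human can scan the raw volume. A toy Python sketch, with invented telemetry and an invented threshold:

```python
from statistics import mean, stdev

def flag_anomalies(readings: list, threshold: float = 3.0) -> list:
    """Return indices of readings more than `threshold` standard deviations from the mean."""
    mu, sigma = mean(readings), stdev(readings)
    return [i for i, r in enumerate(readings) if abs(r - mu) > threshold * sigma]

# Invented temperature telemetry with one obvious outlier at index 4.
telemetry = [21.0, 21.2, 20.9, 21.1, 35.0, 21.0, 20.8, 21.3]
anomalies = flag_anomalies(telemetry, threshold=2.0)  # [4]
```

Real IoT analytics would need far more sophisticated models, but even this trivial filter shows why the review step cannot be manual at terabyte scale.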


What Every Business Owner Needs to Know About Data Sovereignty
Unfortunately, the laws and regulations protecting digital information can be extremely complex. They are dependent on different governments and jurisdictions, and data stored in certain countries may or may not be subject to subpoena by another country’s government. As an IT professional, you’re likely responsible for ensuring that your company’s data is fully protected. However, you need to provide your business’s owner with the basics to enable him or her to make the best decisions for the company — and the valuable data it possesses. For those who don’t work with technology all day, however, the variables can be overwhelming.


Collective intelligence, big data and IEML
There are two big problems with this landscape: The first is related to methodology; today we use mainly statistical methods and logical methods. It is very difficult to have a semantic analysis of the data, because we do not have a semantic code, and let’s remember that everything we analyze is coded before we analyze it. ... So you need a semantic code to have a semantic analysis. We do not have it yet, but I think that IEML will be that code. The second problem is the fact that this analysis of data is currently in the hands of very powerful or rich players: big governments, big companies. It is expensive and it is not easy to do; you need to learn how to code, you need to learn how to read statistics, and that is not easy.


MSSP: Integrate, NOT Outsource!
This means that for the MSSP to work well for you, process integration must be carefully planned. Here we talked about the alert response integration (and here about the SLAs), but the same applies to device management (integrate with your change management and reporting), incident response (integrate with your IR) and many other processes. This also means that this focus on integration allows you to vary the degree of security ‘outsourcing’ or externalization. If your plan – monitor – triage – respond – refine chain is well planned, you can almost painlessly engage external resources (MSSP, consultants, etc.) at whatever stage: need more help with cleaning the mess? Call that IR consultant. Want to shift some perimeter monitoring duties outside? Go get that MSSP.


Requirements Discovery and Constraints Analysis
The process of requirements discovery broadly involves elicitation of functional and non-functional requirements from business needs. A business or enterprise architect’s role in requirements discovery is wider and broader in terms of scope, responsibility, and the nature and stage of engagement. ... The nature of business concerns will not be limited to problems addressable by a technology solution but will also include considerations such as investments, ROI (Return on Investment), business case, timelines, priorities, risks and solution strategies, potentially involving an ecosystem of internal and external stakeholders (e.g. technology providers).


Simulation-Based Embedded Agile Development
While simulations containing embedded software need not be developed in an agile manner, Scrum’s agile framework helps realize greater benefits from a SiS approach. One Scrum event is the sprint review, in which the development team demonstrates what was accomplished during the sprint. It can be challenging to have something visual to demonstrate with embedded software development as there is often little to “see.” We might get only a blinking light or a wiggling fin. ... When such feedback is used in the sprint review as well as daily collaboration, these collective learning opportunities allow more nimble responses to necessary changes in requirements and design.


BlazeMeter, New Relic Team Up To Deliver Richer App Performance Testing Analytics
“Data analysis is most valuable when you can understand and act upon it instantly. Testing makes it easy to trigger a symptom, but you need monitoring to identify the root problem in the first place,” Girmonsky told IDN. “Together, BlazeMeter and New Relic provide their customers a full 360-degree view of their systems. Customers can dynamically define the KPIs they want to analyze, query the application and instantly understand the specific quirks of their system,” he added. The growing BlazeMeter/New Relic partnership is also a sign of how IT is increasing its use of machine data and big data to improve their software lifecycle -- design, development, testing and operations.


Optimizing Enterprise Risk for Value Creation
With IT risk being a subset of Enterprise risk, and given the pervasiveness of technology within the business, optimizing IT risk has a direct and positive effect on the overall risk of the organization. So important is risk optimization of the Enterprise’s IT to the organization that within COBIT 5 there are not one but two dedicated processes - ‘Ensure Risk Optimization’ and ‘Manage Risk’. The Ensure Risk Optimization process is within the Governance area of the COBIT 5 framework and is supported by 3 governance practices and 16 activities. The process ensures that the enterprise’s risk appetite and tolerance are understood and not exceeded by Enterprise IT, the impact of IT risk to enterprise value is identified and managed, and the potential for compliance failures is minimized.



Quote for the day:

"Take the first step in faith. You don't have to see the whole staircase, just take the first step." -- Martin Luther King Jr.

November 15, 2014

5 Hadoop Security Projects
While other projects attempt to improve Hadoop’s security from the inside, Apache Knox Gateway tries to do it from the outside. Apache Knox Gateway creates a security perimeter between Hadoop and the rest of the world by providing a REST API gateway for interacting with Hadoop clusters. All communication with Hadoop is done via Knox Gateway, which controls and moderates it. Knox includes the following features: LDAP and Active Directory integration, support for identity federation based on HTTP headers, and service-level authorization and auditing.
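The perimeter pattern Knox implements can be reduced to a simple idea: every request hits one gate that authenticates, authorizes per service, and audits before anything is forwarded to the cluster. A toy Python model of that flow (names and logic are invented and do not reflect Knox's actual API):

```python
# Toy perimeter gateway: cluster handlers are reachable only through gate(),
# which authenticates, checks service-level authorization, and audits first.

USERS = {"alice": "s3cret"}      # stand-in for an LDAP/Active Directory lookup
GRANTS = {"alice": {"webhdfs"}}  # service-level authorization per user

def gate(user: str, password: str, service: str, audit: list) -> str:
    audit.append(f"{user} -> {service}")  # every attempt is audited, even failures
    if USERS.get(user) != password:
        return "401 Unauthorized"
    if service not in GRANTS.get(user, set()):
        return "403 Forbidden"
    return f"200 OK: forwarded to {service}"

audit_log = []
assert gate("alice", "s3cret", "webhdfs", audit_log).startswith("200")
assert gate("alice", "wrong", "webhdfs", audit_log) == "401 Unauthorized"
assert gate("alice", "s3cret", "hbase", audit_log) == "403 Forbidden"
assert len(audit_log) == 3  # all three attempts were recorded
```

The point of the pattern is that the cluster itself never sees unauthenticated traffic, and there is exactly one place to enforce and log policy.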


Amazon Phishing Attacks Pick Up for Holiday Shopping Season
"If you get an email with a Word attachment, don't open it, just go to the site, log into your account, and all the transaction history is right there readily available." he said. "It's always a good idea to go right to the horse's mouth." So far this month, AppRiver has quarantined more than 600,000 email messages with the subject line "Your Amazon Order Has Dispatched (#3digits-7digits-7digits)" and a return address of "amazon.co.uk." The attached Word document has a macro that installs a Trojan dropper that creates a process named "SUVCKSGZTGK.exe" and the dropper then installs a keylogger that harvests banking information, email logins, and social media accounts.


ETH Researchers Develop a Thought-Controlled Genetic Interface
Using the interface they designed, the ETH team showed a human volunteer wearing an EEG cap could use his thoughts to trigger production of a particular protein, called SEAP, in human kidney cells growing in a petri dish. He could also turn on supplies of the cells that had been implanted under the skin of lab mice. The research is interesting because it shows how futuristic brain implants might function, Folcher and company write in this week’s Nature Communications. Such devices, the ETH authors speculate, might sense a person’s feelings of pain (or perhaps oncoming epileptic seizure) and then automatically trigger brain cells to pump out a helpful biotech drug.


Facebook nudges users to take control with privacy makeover
"Over the past year, we've introduced new features and controls to help you get more out of Facebook, and listened to people who have asked us to better explain how we get and use information," wrote Erin Egan, Facebook's chief privacy officer. "Protecting people's information and providing meaningful privacy controls are at the core of everything we do, and we believe today's announcement is an important step." Facebook has had its share of privacy controversies. It has repeatedly been criticized for its privacy policies and even for the difficulty in using privacy controls.


Why bug bounty hunters love the thrill of the chase
“Having a look at the security community, we can tell that there are a lot of top-notch bug hunters who fulfill nearly all of the above points. On the other hand, there are ‘unskilled’ or new bug hunters who try to make some quick bucks by using one-click-tools and sometimes go as far as threatening the business owners. We refuse to call these people ‘bug hunters’,” they said. They enjoy bug bounty hunting because it gives them the freedom to break things whenever they want. “By submitting useful reports the chances are good that more and more companies will get the idea about responsible disclosure,” they said in calling bug bounty hunting the ultimate in crowdsourcing.


Security Skills Gap Continues to Stymie Enterprise Cyber-Defenses
"Good resources are scarce and you have to find new ways to provide needed security services," Chip Tsantes, chief technology officer of the cyber-security practice at Ernst & Young, told eWEEK. “You have to be more creative to find the skills that you need.” The lack of information-security professionals has been a common theme over the past five years. More recently, government hiring and the increase in the number of devices added to networks requiring security support has led to a continue shortfall in skilled security people, which Cisco estimates at 1 million workers worldwide.


10 Big Data Career Killers
Data scientists are in high demand. The Big Data market will grow anywhere from 20 percent to 40 percent annually through 2017, depending on the market forecast you trust most. But even an industry boom doesn't guarantee job security. Here are 10 missteps that can stop your Big Data career in its tracks. Note: Special thanks to Jack Welch, executive chairman of Jack Welch Management Institute at Strayer University. Taking poetic and editorial license, we adjusted his "10 Career-Killing Pitfalls" list to focus on the Big Data market.


Next-Generation Robot Needs Your Help
“It is very good idea,” says Bilge Mutlu, an assistant professor at the University of Wisconsin, Madison, who researches the interaction between humans and robots. “It’s a lot more flexible and adaptable to day-to-day environments.” Human-robot collaboration is already increasing in industrial settings (see “Increasingly, Robots of All Sizes are Human Workmates”). Finding ways for machines to collaborate in other settings could hasten the development of a new generation of service robot. “I am 100 percent sure that if people embraced robots with limitations we would have them in our homes as we speak,” Veloso says.


Chief data officer: My mixed and nuanced musings on the need for one
When people say that "data is the new oil," they're usually making a general statement on how deeply modern organizations depend on data to drive transactions, analytics and processes in general. It's not a statement about public sector institutions but about organizations of any sort. It's in that context that many organizations decide to appoint something called a chief data officer (CDO) to oversee this precious resource. If you want a deep dive into what the CDO role entails, I strongly urge you to download this excellent whitepaper from the IBM Center for Applied Insights.


Fifty Quick Ideas to Improve Your User Stories
Teams often struggle to sell stories as small chunks of work that need to fit into a sprint. Business stakeholders simply don't care about that (justifiably so), because it is purely technical. We end up organising things that are easy to develop, not things that are valuable to a stakeholder. Small stories are good not because they fit into a sprint, but because an organisation can quickly get feedback from them. A story is supposed to deliver something valuable to a stakeholder, and if it does, we should be able to decide from a business perspective whether the work is really done, learn from that delivery and get ideas for future work.



Quote for the day:

"Ninety-nine percent of all failures come from people who have a habit of making excuses." -- George Washington Carver