In the world of big data, a visualization is merely a vehicle: a vehicle for creating patterns, familiarity, and salience with data so that we can attract users’ attention and tap into their iconic memory. But to convince users to take action, we must think deeper and tap into their short-term and long-term memories: Who is my audience? Why should they care? Will I make their jobs easier and help them create more impact? With this framework in mind, let’s look at the two data visualization examples below and see which one is more effective. For illustration purposes, let’s assume that the user of this data visualization is a project manager at an IT consulting agency. Her performance is measured by the number of projects she handles and how quickly she delivers solutions to her customers. To achieve high impact, she constantly looks for areas that hinder her efforts or projects that drag down her performance. ... The magic of action-driven visual analysis is never about the beauty of the chart, but rather the thought process behind it: identify what is important for your audience, and then use visualization tools to surface what they care about.
Quad9 blocks known malicious domains, preventing your computers and IoT devices from connecting to malware or phishing sites. Whenever a Quad9 user clicks a website link or types an address into a web browser, Quad9 checks the site against IBM X-Force threat intelligence, which comprises 800+ terabytes of threat intelligence data, including 40B+ analyzed web pages and images and 17 million spam and phishing attacks monitored daily. Advanced analysis is performed on IP addresses to assign a risk score based on text, visual object recognition, optical character recognition (OCR), structure and linkages to other sites, and the presence of suspicious files, identifying malicious IPs. This feed, combined with multiple other threat intelligence providers, allows Quad9 to block a large portion of the threats that present risk to end users and businesses alike. It’s worth noting that Quad9 doesn’t rely on IBM’s threat intelligence alone: 18 other feeds contribute to its threat blocking, which is fairly unique and gives a cross-section of blocking abilities from some of the world’s best threat management organizations.
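To make the resolver interaction concrete, here is a minimal sketch of the DNS A-record query a stub resolver sends over UDP to a recursive resolver such as Quad9 at 9.9.9.9. The helper name `build_dns_query` is hypothetical and this is purely illustrative; it is not Quad9's own code.

```python
import struct

def build_dns_query(domain: str, txid: int = 0x1234) -> bytes:
    """Build a minimal DNS A-record query packet (RFC 1035 wire format),
    the kind a stub resolver would send to a recursive resolver such as
    Quad9 at 9.9.9.9. Illustrative sketch only."""
    # Header: transaction id, flags (RD=1 requests recursion),
    # 1 question, 0 answer / authority / additional records.
    header = struct.pack(">HHHHHH", txid, 0x0100, 1, 0, 0, 0)
    # Question name: length-prefixed labels, terminated by a zero byte.
    qname = b"".join(
        bytes([len(label)]) + label.encode("ascii")
        for label in domain.split(".")
    ) + b"\x00"
    qtype_qclass = struct.pack(">HH", 1, 1)  # QTYPE=A, QCLASS=IN
    return header + qname + qtype_qclass

packet = build_dns_query("example.com")
```

Sending this packet with `socket.sendto(packet, ("9.9.9.9", 53))` is what triggers Quad9's threat-feed check; a domain on its blocklist gets back an NXDOMAIN-style response instead of an address.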
Whatever business you are in, it’s likely that somewhere in your organization there is a person or team responsible for project management. If they are handling multiple projects, juggling the grouping of projects, overseeing workflows, and allocating tasks, their job becomes more about resource planning. That means they need a firm grip on who is doing what, where, and when, and must determine whether all resources are being used in the smartest way. They can use a variety of tools to help organize the resources at their disposal, which may be people, equipment, machinery, or office space. Some people schedule resources using Excel spreadsheets and an assortment of other unsophisticated tools, including calendars, whiteboards, and notepads. Whether these choices are made for reasons of economy or out of lack of awareness of a better alternative, the failure to use specialist resource planning software is probably costing them time, money, and the respect of team members and senior management.
If by Enterprise Architecture someone truly means the roles, processes, value streams, business capabilities, and ecosystems for your manufacturing, supply chain, finance, human resources, or everything your company does, then Business Architecture is absolutely part of that. It’s about translating from the vision-and-strategy level of the architecture, through an understanding of the business needs and gaps, before then architecting, or guiding, specific solutions, where solutions include processes, roles, systems, information flows, and technology. In that context, we’ve been building Business Architecture into the TOGAF ecosystem with a set of guides on how to apply Business Architecture as a highly strategic tool and methodology. This enables the architect to flow from strategy around the ADM wheel and drive to the right solutions. The case study we presented uses those same Business Architecture methods, followed step by step, including examples from Raytheon’s Sales and Operations Planning. We walk through business modeling in the Vision phase, then value-stream analysis and business capability mapping in the Business Architecture phase.
It is a future where individual consumers sit at the center of their personal worlds and access the services that fit best into their lives, thanks to the data about themselves that they choose to share with brands they trust. Moreover, we are talking about trust on a personal, emotionally engaged level, not just the trust we have in a utility-style process that will work exactly the same way the next time we need it. Although banks hold huge quantities of transactional data on millions of customers, they already face serious challenges in maintaining the quality of that data and in how it is used on behalf of the consumer. As customers turn to new payment methods, banks progressively lose the granular detail they used to have about their customers’ spending. Instead, they see a stream of transactions in which someone other than the bank ‘owns’ the relationship: Apple Pay, PayPal transfers, a direct debit to a Nutmeg or Betterment account, or value stored on a Starbucks mobile app … with rewards attached to many of these relationships. As a result, a consumer can leave a bank in every way that matters without ever closing their account.
The public mistrust of companies is also part of a longtime trend, one of declining respect for all institutions, not just corporations. According to Edelman, the mistrust of media is even greater than the mistrust of business. In the 28 geographies that Edelman surveyed, the overall trust for institutions accrued most to NGOs, then to business, then to government, and finally to media. In 21 of these geographies, business is more trusted than government. In that context, when it comes to dealing with social issues and fostering overall economic growth, people around the world increasingly expect business to step up to the responsibility. Other institutions have lost their license to lead; they aren’t seen as capable of making the right things happen. Two possible reasons for this shift in attitude come to mind. First, the private sector is now seen as an effective actor when business leaders choose to participate in solving the pernicious problems of our time: cybercrime, terrorism, the threat of nuclear war, income inequality and its political impact, and environmental damage.
Microsoft's new .NET Core 2.0 and .NET Standard 2.0 offerings may be generating the most buzz among .NET developers these days, but for many use cases the traditional .NET Framework, just released in version 4.7.2, is still the best choice. The new .NET Framework 4.7.2 is the next major update after the October 2017 release of v4.7.1, which added support for .NET Standard 2.0. That standard defines the APIs that all conformant .NET implementations must provide, ensuring consistent API usage across development projects, and replaces the earlier Portable Class Libraries (PCL) as the means of creating libraries that work in all scenarios. While .NET Core offers cross-platform functionality and more, the 16-year-old .NET Framework is still an optimal choice for Windows desktop projects such as WinForms, WPF, and ASP.NET WebForms apps. Both .NET Core and .NET Framework are used for creating server-side apps ... Microsoft also provided guidance about when, and when not, to consider porting existing .NET Framework projects to .NET Core.
It's a step up from waterfall, where process adds significant time to project completion in the name of stability. But DevOps itself is already wrong for the modern world: streamlined processes between development and operations are useful only if the outcome supports the business. DevOps does not solve the problem of IT effectiveness, in which IT must not only work quickly but also stay attuned to business requirements and project goals. DevOps is better christened BizDevOps, because everything that happens must be driven by the business. Development teams can be too selective: instead of focusing on an issue operations has identified as critical, they spend time on technically interesting and intellectually challenging tasks that are less pressing. The standard help desk feedback loop, in which operations and users raise issues in production, is wrong for DevOps. A DevOps feedback loop enforces priorities and project goals so that the freedom and fast pace of development doesn't lead it astray.
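One way to picture a feedback loop that enforces business priorities is a backlog ordered strictly by business impact, so the next task popped is always the most critical one, however technically interesting the alternatives are. The sketch below is hypothetical (the `WorkItem` shape and priority scale are assumptions, not a prescribed BizDevOps tool):

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class WorkItem:
    # Lower number = higher business priority; heapq pops the smallest first.
    business_priority: int
    title: str = field(compare=False)  # excluded from ordering on purpose

def next_task(backlog: list) -> WorkItem:
    """Pop the most business-critical item, regardless of how
    technically appealing the other items are."""
    return heapq.heappop(backlog)

backlog = [
    WorkItem(3, "Refactor internal tooling (interesting, low impact)"),
    WorkItem(1, "Fix checkout outage reported by operations"),
    WorkItem(2, "Add telemetry requested by the business"),
]
heapq.heapify(backlog)
```

Here `next_task(backlog)` returns the operations-reported outage first, even though the refactoring task was added to the backlog earlier, which is exactly the priority-enforcing behavior the paragraph describes.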
"It was scary for a lot of people," Thompson said. He ultimately reassigned those workers to engage with the company's clients and perform other higher value tasks. "Our business leaders are coming along for the journey," Thompson said. "They didn’t think these things were even possible and we’re now showing them the art of the possible." CIOs aren't looking to shed staff so much as free workers up for other work. To that end, bots are a big part of the plans for Walmart, which employs 2.3 million people. Walmart CIO Clay Johnson, who spoke on the panel along with Thompson, said the retail giant has deployed about 500 bots to automate anything from answering employee questions to retrieving useful information from audit documents. "A lot of those came from people who are tired of the work," Johnson said. Freeing up staff is part of Johnson's process automation plan to make Walmart's massive workforce more efficient. More broadly, Johnson's IT strategy entails delivering IT services as a series of products rather than traditional IT project management freighted with set deadlines and rigorous processes.
We don’t describe TigerGraph as an in-memory database, because having data in memory is a preference but not a requirement. Users can set parameters that specify how much of the available memory may be used for holding the graph. If the full graph does not fit in memory, then the excess is stored on disk. Best performance is achieved when the full graph fits in memory, of course. Data values are stored in encoded formats that effectively compress the data. The compression factor varies with the graph structure and data, but typical compression factors are between 2x and 10x. Compression has two advantages: First, a larger amount of graph data can fit in memory and in cache. Such compression reduces not only the memory footprint, but also CPU cache misses, speeding up overall query performance. Second, for users with very large graphs, hardware costs are reduced. For example, if the compression factor is 4x, then an organization may be able to fit all its data in one machine instead of four. Decompression/decoding is very fast and transparent to end users, so the benefits of compression outweigh the small time delay for compression/decompression. In general, decompression is needed only for displaying the data.
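TigerGraph's actual encoding format isn't spelled out here, but dictionary encoding is one common way stores shrink columns of repeated values into a small lookup table plus integer codes, and it illustrates why decoding can stay a fast, transparent table lookup. A generic sketch, not TigerGraph's implementation:

```python
def dict_encode(values):
    """Dictionary-encode a column of repeated values: store each distinct
    value once in a lookup table and replace occurrences with small
    integer codes. Illustrative of how encoded storage compresses data."""
    table, codes = {}, []
    for v in values:
        # setdefault assigns the next code the first time a value appears.
        codes.append(table.setdefault(v, len(table)))
    lookup = list(table)  # code -> original value, in insertion order
    return lookup, codes

def dict_decode(lookup, codes):
    # Decoding is a constant-time table lookup per value.
    return [lookup[c] for c in codes]

cities = ["NYC", "SF", "NYC", "NYC", "SF", "LA", "NYC"]
lookup, codes = dict_encode(cities)
# lookup is ["NYC", "SF", "LA"]; codes is [0, 1, 0, 0, 1, 2, 0]
```

With only three distinct strings across seven rows, the table plus one small integer per row occupies far less space than seven full strings, and `dict_decode` reproduces the original column exactly; the same trade-off underlies the 2x-10x factors described above.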
Quote for the day:
"Wherever there is authority, there is a natural inclination to disobedience." -- Thomas Haliburton