Today’s enterprise is a federation of companies with vast collections of dynamic services that are enabled/disabled frequently with ever-changing sets of authentication and access control. To survive in this environment, a modern enterprise needs to develop an intimate yet secure ecosystem of partners, suppliers and customers. So unlike the rudimentary connectivity case, the typical production application is composed of many dozens and perhaps hundreds of services, some internal to an enterprise and some residing in a collection of external cloud infrastructures or data centers. For example, the incredibly successful Amazon ecommerce website performs 100-150 internal service calls just to get data to build a personalized web experience.
Scrum teams deliver a working and tested result every Sprint. So they don't just deliver the end result after a year, but a small extra step every month or less. But remember, they only deliver results that are truly finished! This greatly expands your ability to steer, so much so that a separate control process loses much of its importance. That control can, for the most part, be handled by the Scrum teams themselves. At the same time, it's essential that teams continuously improve. This is the critically important role the manager plays: helping the team improve by removing obstacles for them. You help create an environment that the Scrum team can work in. This means you should hold yourself back from intervening too much.
A trap many well-meaning but less experienced navigators fall into fairly often is to offer up advice the moment they spot a problem. Good navigators know when to wait a little before pointing out a missing semicolon somewhere, and will do it when there’s a natural pause in the driving. A large number of interruptions arising from the driver’s unfamiliarity might be a good indication that it’s time to swap roles, even if only briefly. For the more interesting and more abstract issues, though, an experienced navigator is good at communicating intent – the what, not the how – and uses inclusive language (“us” and “we”, rather than “I” or “you”) as much as possible while at it, so the driver is invited to revisit some of the motivations behind intents they might not necessarily agree on.
Again, the majority of organisations cannot say for sure that adopting cloud services will result in savings because they don't know how much those services cost to run in-house. At best, the organisation might know the total cost of IT infrastructure, software and skills, and be able to roughly split that between the services provided. The decision to move to the cloud will therefore be based on this estimate; yet this ignores both that the real costs are far more complex and that, by moving services to the cloud, you are not removing costs but changing them. For instance, an organisation with its CRM service presently hosted in a data centre may decide to move CRM to the cloud.
While there isn’t much difference in how you develop the application itself, supporting IoT devices does require that software engineers become proficient with device-level application programming interfaces (APIs). IoT integration is all about APIs, the logical connectors that allow applications to communicate with each manufacturer’s IoT devices. APIs can act as a data interface, exposing the readings that devices transmit to your applications; or they can act as a function interface, allowing your application to control the device. While device manufacturers are taking steps to ensure that their APIs are well defined, developers must learn how to use IoT device interfaces effectively. Fortunately, third-party providers are also producing tools that make each IoT device manufacturer’s APIs easier for developers to use.
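The split between the two API roles can be sketched in a few lines. This is a hypothetical device API, not any real manufacturer's SDK; the class and method names are invented for illustration.

```python
# Hypothetical illustration of the two API roles described above:
# a "data interface" that exposes readings, and a "function
# interface" that controls the device.

class ThermostatAPI:
    """Minimal stand-in for a manufacturer's device API."""

    def __init__(self):
        self._temperature_c = 21.5   # last reported sensor reading
        self._target_c = 20.0

    # Data interface: the device transmits data to your application.
    def read_temperature(self):
        return self._temperature_c

    # Function interface: your application controls the device.
    def set_target_temperature(self, celsius):
        self._target_c = celsius
        return self._target_c


api = ThermostatAPI()
reading = api.read_temperature()     # pull sensor data
api.set_target_temperature(22.0)     # send a command to the device
```

A real manufacturer API would sit behind a network transport (HTTP, MQTT, BLE), but the data-versus-function distinction is the same.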
Ongoing innovation and continuous feature updates are hallmarks of the platform business model. In The Cookie Dining case, the platform is expanding on a number of fronts. A feedback manager, which will let customers rate their food and delivery experience, is scheduled for release by the end of September. Integration with Yelp, which posts customer reviews of restaurants and other businesses, is also slated for September. Cookie is also at work on a point-of-sale (POS) system for in-store sales, Manojlovic said. Cookie POS v1.0 should be available for beta testing in December, he noted, adding that the idea is to unify "the whole sales experience for the restaurant."
The issue is that many makers of "things" still apply a traditional "box" mentality to products and do not consider the extra revenue opportunities of licensing-controlled embedded software and applications. Most of these companies are first-time software providers, mainly device manufacturers and OEMs that can now monetise their software as well as the devices via the IoT. For these companies, the IoT represents a significant market opportunity. “By monetising the software on their devices, these vendors will be able to increase and drive recurring revenue streams, creating billions of dollars of additional value,” Wurster adds. ... For the foreseeable future, Wurster believes the IoT will drive business transformation for many device manufacturers, enabling them to use software on the device to differentiate product and solution offerings.
The point about data algebra is that it genuinely represents data in a software compatible manner – any data. There is a back story to why this algebra was created. It was not a small effort, and it was years in gestation. In fact, Algebraix Data Corporation, founded by software engineers who believed a mathematical approach to data was possible, spent over six years creating, enriching and proving data algebra’s applicability. This was an extensive research activity that primarily involved using data algebra directly in a variety of data management activities: defining data, organizing data, querying data and optimizing the queries for performance. This was the focus of the research partly because it was decided that the best area to prove data algebra was in using it to manipulate and transform data in applications that did little else: database optimizers for data in both tables and graphs.
When the first system of record’s data meets the organization’s quality standards for that data type, the organization should build a real-time data quality firewall around it. With a data quality firewall, no matter where the data is coming from (online customers, a merchant, etc.), the firewall intercepts the data, cleanses it, and only then allows the data to enter the system of record. ... Profiling is both a technical challenge and management challenge. Questions will remain: How much more money should we dedicate to cleansing data? When is it clean enough? What return on investment do we need to make this particular cleansing process worthwhile? Again, these are strategic questions for the organization to evaluate as they weigh the importance of data sets.
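The intercept-cleanse-admit flow described above can be sketched in miniature. The record fields, cleansing rules, and the in-memory "system of record" are all illustrative assumptions, not a real product's API.

```python
# A minimal sketch of the "data quality firewall" idea: every record,
# whatever its source, is intercepted, cleansed, and only then allowed
# to enter the system of record.

SYSTEM_OF_RECORD = []   # stands in for the real system of record

def cleanse(record):
    """Normalize a raw customer record; return None if unsalvageable."""
    email = record.get("email", "").strip().lower()
    name = " ".join(record.get("name", "").split()).title()
    if "@" not in email or not name:
        return None                 # reject rather than pollute the store
    return {"name": name, "email": email}

def firewall_ingest(record):
    """Only cleansed records may pass through to the system of record."""
    clean = cleanse(record)
    if clean is not None:
        SYSTEM_OF_RECORD.append(clean)
    return clean

firewall_ingest({"name": "  ada   lovelace ", "email": "Ada@Example.COM"})
firewall_ingest({"name": "", "email": "not-an-email"})   # blocked
```

The strategic questions in the excerpt (how much cleansing is worthwhile?) then become questions about how strict `cleanse` should be.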
"We have always expected business continuity and disaster recovery considerations to be incorporated in an institution's business model," the report states. "However, in addition to preparing for natural disasters and other physical threats, continuity now also means preserving access to customer data and the integrity and security of that data in the face of cyber-attacks." That's why FDIC says it "encourages banks to practice responses to cyber-risk as part of their regular disaster-planning and business-continuity exercises." The FDIC suggests that community bank directors use the cyber challenge program to openly discuss operational risks with their peers and employees and review the potential impact of cyber-attacks and other technology disruptions on their customers and operations.
Quote for the day: "Reduce the layers of management.They put distance between the top of an organization and the customers." -- Donald Rumsfeld
"Even the deals that do come will be smaller," wrote Ray Hennessey, editorial director of Entrepreneur.com. "Private-company valuations generally follow public-company ones. … If tech companies on the Nasdaq suffer a Black Monday, it will be a Grey Tuesday for private companies seeking venture money." Another factor that may impact businesses in the wake of this week's stock market instability is that in times of market uncertainty, people tend to cut back their spending, according to Hennessey. The connection between those factors and IT budgets? If your company is in the midst of raising funds and has to pay more to borrow money, it ends up in a price war with competitors; if your customers start curbing spending, cuts to company spending could be made, and it might be your 2016 IT budget that's on the list.
Increasingly, OpenStack is brought in for “onboarding a first software initiative or a particular business unit,” he said. “We see fewer and fewer people doing just experiments.” That’s not to say OpenStack has taken the world by storm. “Big rollouts require some serious spine from executives,” Ionel said, noting that OpenStack implementation is far from “frictionless.” The complexity of the framework is why Intel spearheaded the $100 million funding that Mirantis announced earlier this week — a follow-up to the other $100 million round announced last year. Intel wants to make OpenStack easier for the everyday enterprise to adopt, and it plans to collaborate with Mirantis on the necessary engineering.
Many companies have been using algorithms in software programs to help filter out job applicants in the hiring process, typically because it can be overwhelming to sort through the applications manually if many apply for the same job. A program can do that instead by scanning resumes and searching for keywords or numbers (such as school grade point averages) and then assigning an overall score to the applicant. These programs also can learn as they analyze more data. Known as machine-learning algorithms, they can change and adapt like humans so they can better predict outcomes. Amazon uses similar algorithms so they can learn the buying habits of customers or more accurately target ads, and Netflix uses them so they can learn the movie tastes of users when recommending new viewing choices.
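A toy version of the screening described above: scan a resume for keywords, pull out a GPA, and combine them into one score. The keyword list and weights are invented for the sketch; real systems are far more elaborate and, as the excerpt notes, often learned rather than hand-coded.

```python
# Keyword-and-GPA resume scoring, in miniature. Weights and the GPA
# pattern are illustrative assumptions.
import re

KEYWORDS = {"python": 2.0, "sql": 1.5, "machine learning": 3.0}

def score_resume(text):
    text = text.lower()
    # add a weight for each keyword that appears anywhere in the resume
    score = sum(w for kw, w in KEYWORDS.items() if kw in text)
    # look for a grade point average like "GPA: 3.8" and add it in
    gpa = re.search(r"gpa[:\s]+([0-4]\.\d+)", text)
    if gpa:
        score += float(gpa.group(1))
    return score

resume = "Experienced in Python and SQL. GPA: 3.8"
print(score_resume(resume))   # 2.0 + 1.5 + 3.8 = 7.3
```

A machine-learning variant would replace the fixed weights with ones fitted to past hiring outcomes, which is exactly where the adaptation described in the excerpt comes in.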
The point of Big Data is that we can do novel things. One of the most promising ways the data is being put to use is in an area called “machine learning.” It is a branch of artificial intelligence, which is a branch of computer science—but with a healthy dose of math. The idea, simply, is to throw a lot of data at a computer and have it identify patterns that humans wouldn’t see, or make decisions based on probabilities at a scale that humans can do well but machines couldn’t until now, or perhaps someday at a scale that humans can never attain. It’s basically a way of getting a computer to do things not by explicitly teaching it what to do, but having the machine figure things out for itself based on massive quantities of information.
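The idea in the excerpt can be shown in miniature: instead of hand-coding a rule, give the machine labelled examples and let it derive the rule itself. This is a deliberately tiny nearest-centroid classifier, one of the simplest possible learners, written from scratch for illustration.

```python
# Learning from data rather than explicit instruction: the program is
# never told where the boundary between "small" and "large" lies; it
# infers that from the examples it is shown.

def fit(examples):
    """examples: list of (value, label). Learn one centroid per label."""
    sums, counts = {}, {}
    for value, label in examples:
        sums[label] = sums.get(label, 0.0) + value
        counts[label] = counts.get(label, 0) + 1
    return {label: sums[label] / counts[label] for label in sums}

def predict(centroids, value):
    """Assign the label whose centroid is nearest to the value."""
    return min(centroids, key=lambda label: abs(value - centroids[label]))

data = [(1.0, "small"), (2.0, "small"), (10.0, "large"), (12.0, "large")]
centroids = fit(data)           # {"small": 1.5, "large": 11.0}
print(predict(centroids, 3.0))  # "small"
```

Scale the same principle up to millions of examples and far richer models and you have the machine learning the excerpt describes.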
Every company embracing innovation does so in its own way. Johnson & Johnson, for instance, maintains several “innovation hubs” around the world, while Eli Lilly has endowed its own venture capital fund to fuel innovation efforts. The single quality these and other companies share is that they have created physical spaces in which to nurture new ideas. If innovation is the application of unorthodox thinking to business opportunities, the innovation lab is where that thinking evolves into new products, services, process efficiencies, partnerships, or business models. The hallmark of the innovation lab is that it is a space set apart—sometimes even isolated—from the rest of the company.
Two fertile decades of debate followed, giving rise along the way to entirely new conceptions of how the universe is built (black holes, it seems, are pretty fundamental components of it). As a new branch of physics called string theory found its feet, it turned out to be good at explaining the rules of order and disorder within the event horizon. And a consensus emerged that while his "Hawking radiation" story of evaporating black holes was correct, Dr Hawking's supposition about the loss of information was not. By 2004, he was forced to concede a bet on the outcome (the winner was to receive an encyclopedia, "from which information can be retrieved at will"). Information was saved. But how? It is that question that has preoccupied theorists, not least Dr Hawking himself, since then.
Turner suggested that biometrics should only be used as an authentication mechanism for local devices, which he said makes Apple's Touch ID unique and the "perfect way" of using biometrics. He said when a person's fingerprints are checked by the cryptographic chip on the Apple device, the information becomes linked to the person's Apple ID, but stays only on that particular device. According to Turner, this means that if a person loses their Apple device, no one else can use the saved credentials from a different device. Turner made this observation in a discussion paper titled Consumerisation of biometrics will result in obsolescence, highlighting that most biometric deployments will "not be well executed, and the failures of these systems will impact the feasibility of biometrics as a means of authentication".
"The risk of being disrupted has never been higher and the time it takes for disruption to happen is shorter than ever," says Cox. "Organisations, therefore, need to be proactively disrupting themselves; challenging their business models, developing technology-enabled enhancements and alternatives to their products and services." ... In fact, such is the power of disruption that Richard Norris, head of IT and business change at Reliance Mutual Insurance Society Limited, says all CIOs must help their businesses to identify opportunities for innovation-led change. Norris implemented a digital innovation group at Reliance about three months ago. Drawing people from across the business, the learning group analyses how digital disruption can affect how services are taken to market
Instead of precision with defined schemas, NoSQL pioneers sought an ability to handle information at high volume and high speed. Instead of getting one transaction exactly right, they wanted to deal with a million users at once. NoSQL offered the sort of approach that a Twitter or Facebook might appreciate. And, in fact, those organizations quickly became big NoSQL users. Avinash Lakshman was a pioneer involved in the formation of two NoSQL systems: Dynamo, during a prior stint at Amazon, and Cassandra at Facebook. For companies with robust public-facing Internet operations – such as social media companies, financial institutions, and retailers – customer service is a primary business driver for deploying NoSQL systems.
“At Facebook, culture is everything and it’s an incredible timesaver,” Campos said. Culture allows Facebook to cut through bureaucracy, he said. Among the ways Facebook emphasizes its culture is through its now well-known posters that say things like: "Fail harder;" "Move fast and break things;" and, "What would you do if you weren’t afraid?" Facebook also reinforces its culture through storytelling, like the “will you resign” email example he shared with the audience. “It was an incredibly powerful message,” Campos explained. “Everybody at the company read this email and had the exact same takeaway and perspective that I did; they all thought it was immediately addressed to them.”
Quote for the day: "Successful people are interdependent, not independent" -- RichardWeylman
Until recently, DWA was associated mostly with automating ETL development – such as generating SSIS packages in the Microsoft environment. Today, however, it covers all the major components of data warehousing from design, development and testing to deployment, operations and change management. It also covers advanced functionality like support for slowly changing dimensions and change data capture. In our experience, DWA delivers up to 80% improvements in the cost-effectiveness of building and running a data warehouse. And, just as important, DWA is far better aligned with modern agile development practices because it encourages a rapid, iterative approach to design.
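One piece of the advanced functionality mentioned above, a Type 2 slowly changing dimension, is easy to sketch: when a tracked attribute changes, the current row is expired and a new versioned row is opened, so history is preserved. The column names and the in-memory "table" are illustrative; a DWA tool would generate the equivalent SQL.

```python
# Type 2 slowly changing dimension, in miniature: an attribute change
# closes the old row (sets its end_date) and appends a new current row.
from datetime import date

dim_customer = []   # stands in for the customer dimension table

def upsert_scd2(key, city, as_of):
    """Apply a change to the dimension while keeping full history."""
    current = next((r for r in dim_customer
                    if r["key"] == key and r["end_date"] is None), None)
    if current and current["city"] == city:
        return                         # nothing changed; no new version
    if current:
        current["end_date"] = as_of    # expire the old version
    dim_customer.append({"key": key, "city": city,
                         "start_date": as_of, "end_date": None})

upsert_scd2(42, "Boston", date(2015, 1, 1))
upsert_scd2(42, "Denver", date(2015, 6, 1))   # move closes the Boston row
```

Generating this kind of boilerplate reliably, across hundreds of dimensions, is exactly the sort of work DWA tools automate.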
A recent meta-analytic study investigated the relationship between women on boards and performance, finding that women can make more of a difference in some countries. In countries with better shareholder protection, female board representation is positively related to profitability; in such contexts, greater gender diversity on boards ensures that the directors bring different knowledge, experience and values. In countries with greater gender parity, female board representation is positively linked to market performance. This relationship between female directorships and market performance is negative in countries with low gender parity.
The movie, which will be released Oct. 2, merges science fiction with actual science about Mars, technology that NASA is working on and the space agency's plans to send astronauts to the Red Planet in the 2030s. Jim Adams, NASA's deputy chief technologist, who has read the book, said he was impressed with the way the author represented the science and means of survival on Mars. "It stimulated a lot of my thinking about what we are doing and our plans on getting to Mars in the 2030s with humans." According to scientists at NASA, they already are developing many of the technologies that appear in the film. Here's a look at some of them.
Historically, data was used as an ancillary to core business and was gathered for specific purposes. Retailers recorded sales for accounting. Manufacturers recorded raw materials for quality management. The number of mouse clicks on advertising banners was collected for calculating advertisement revenue. But as the demand for Big Data analytics emerged, data no longer serves only its initial purpose. Companies able to access huge amounts of data possess a valuable asset that, when combined with the ability to analyze it, has created a whole new industry. ITA Software, for example, is a private company that gathers flight price data from almost all major carriers, with the exception of JetBlue and Southwest, and sells that information to travel agents and websites.
The problem is… that email wasn’t from your bank, and the link did not take you to your banking page. It took you to a fake website mimicking the real website’s look and feel, and you just gave the fraudsters the login details for your online banking. You did it because it looked real and you were scared that someone was going to take your money – but instead you walked straight into a trap. Sometimes the emails come with a phone number to call that lead you to an interactive voice system, just like your bank’s. You are asked to enter your bank account number and your sort code, and to divulge digits of your access code – little realising that you are giving this information straight to the criminals.
We spend hours looking at data about companies that are successful when they try to do major software implementation. The first thing is: it’s not for the faint of heart. There are so many things to consider. One of the interesting things we’re finding in the HR space is that more and more human capital and HR departments are actually running the software implementations. It used to be that IT always bought the software. But now that Software as a Service (SaaS) provides different price points, we are seeing HR directors and professionals who are much more involved in running new HR software initiatives. Not only do we have more HR people, but we also have people who may not have ever managed an implementation before.
VR hasn't been completely dormant. "The folks who are just entering the field and are excited by the Oculus and the related technology product development are mistaken in thinking that what they're doing is new," said Linda Jacobson. Jacobson is the author of Garage Virtual Reality, a 1994 book outlining the past, present, and future of VR. She was one of the founding contributing editors of Wired Magazine, and a former virtual reality evangelist for Silicon Graphics. "What's new is this particular set of products at a new price point, as well as the availability of new people and new talent who are looking at it," she said. During those seemingly quiet years for VR, car manufacturers started using it to design cars and test user experience.
OKR (Objectives and Key Results) is a goal setting framework created by Intel and adopted by several Silicon Valley companies. Google is the most famous case, having adopted OKR in its first year. Twitter, LinkedIn, Dropbox and Oracle are among other adopters. ... The main objective of OKR is to create alignment in the organization. In order to do so, transparency is key. OKRs are public to all company levels — everyone has access to everyone else's OKRs. All OKRs, including the CEO's, are usually available on the Intranet. OKRs exist to set clear priorities and to focus the organization. In order to do that, you should have few OKRs.
This may sound like a "let the markets decide" argument against unionization, but rather than the markets, it's up to the individual IT worker to ensure his or her interests are well represented and accounted for. Neither an uncaring marketplace nor a collective-oriented union can fully represent one's individual, rational self-interest. While there may be a line of people waiting to take my position, I maintain control over my skills and qualifications, and I will happily say "no" to an unreasonable demand as long as my skills and capabilities are appropriate for my job, and my performance is more attractive than that of the nearest competitor.
Enterprise architecture provides an Agile project with a vision in the form of principles and models. Agile provides Enterprise Architecture with a good set of principles, showing that a multidisciplinary way of working is key. Also, we can learn from the success of Agile and Scrum. If you look at them as architectures, they can even help improve the enterprise architecture profession. Organizations do need to ask themselves whether all architects they currently have will remain relevant. Some of what architects currently do (this holds especially true for solution architects) is now the responsibility of Agile teams. So what is the impact of this from a training and consulting perspective? The first thing is that both enterprise architecture and Agile remain relevant and people and organizations will require training and consulting in both.
Quote for the day: "It's not about how smart you are--it's about capturing minds." -- Richie Norton
The power of collective intelligence is that you get to these optimal solutions fast. When we first started holding these two-day sessions, the most common comment on the evaluations was, 'I cannot believe how much work we did in so short a period of time.' That's the function of having the network in the room. Nothing is as powerful as getting the whole system in the room because, as issues come up, you can say, how will this affect you? Even if the representatives are not the leaders of the group, it doesn't matter. As long as the voice is there, it seemed to work. By having them there, we could say, 'We can't stop until these four people are all comfortable with what we're going to do because all four people are impacted.' In hierarchies, you don't realize who is impacted until sometimes you're halfway through the project.
One reason that companies are unable to benefit fully from their investments in big data is that “management practices haven’t caught up with their technology platforms,” according to Ross and Quaadgras. For example, companies that have installed digital platforms, such as enterprise resource planning (ERP) systems and customer relationship management (CRM) systems over the past 10 to 15 years, haven’t yet taken full advantage of the information they make available. A cultural change is needed within companies so that “all decision makers have performance data at their fingertips every day,” Ross and Quaadgras write.
One key aspect of creating a conducive culture within an organisation is to be overseen by a board of directors whose members come from diverse backgrounds. By introducing multiple perspectives into the mix, dangers like ‘groupthink’, where one kind of personality or way of looking at the world comes to rule the corporate culture, can be avoided. As a result, bringing in diversity, for instance wider female participation (with two thirds of companies actively seeking to introduce more women to the board), cultural diversity and other forms of diversity such as social background, is growing in importance in boardrooms. In terms of female diversity, Eastern Europe comes out on top, with nearly a quarter of executives being women, followed by Latin America.
Understanding context is key to applying these ideas but such situational awareness is a rarity in corporates. The lack of this causes visible symptoms such as poor communication, misapplication of doctrine (e.g. agile everywhere or six sigma everywhere), massive cost overruns in contracts, silos, duplication, constant reinventing of the wheel and a long list of other undesirable effects. I did want to write a post on the 61 different forms of strategic play and how to manipulate an economic environment but given the responses I've received from the Wardley mapping post, it seems something more basic is required.
One action that I advocate for IT leaders is to create the technology maps that their enterprises will need to negotiate today’s marketplace. Modern executives should never be surprised by technology. They might be disappointed by technology. Frequently they should be ashamed at their ham-handed, small-minded attitudes toward the adoption and deployment of technology. Some should be flogged publicly for their bordering-on-malfeasance inability to make money with the technology cornucopia that defines modern existence. But they should never be surprised by technology. Technology futures are knowable. Technology futures and possible technology opportunities need to be mapped.
Could this be true? What about the other 19%? Feeling a bit skeptical about what I was reading, I checked the research methodology, in particular, the demographics of the respondents: 814 IT security decision makers and practitioners, all from organizations with more than 500 employees. The respondents represented seven countries in North America and Europe and 19 industries. Seems pretty comprehensive. Another study performed earlier this year by Accenture titled Business Resilience in the Face of Cyber Risk, reported that: 66% of executives experience significant attacks on their IT systems on a daily or weekly basis; yet only 9% of executives run ongoing security penetration or continuity of business/disaster recovery tests on their systems.
Andresen’s gloomy prediction stems from the fact that Bitcoin can’t process more than seven transactions a second. That’s a tiny volume compared to the tens of thousands per second that payment systems like Visa can handle—and a limit he expects to start crippling Bitcoin early in 2016. The limit stems from the maximum size of the “blocks” that are added to the digital ledger of Bitcoin transactions, the blockchain, by people dubbed miners who run software that confirms Bitcoin transactions and creates new Bitcoin. Andresen’s proposed solution triggered an uproar among people who use or work with Bitcoin when he introduced it two weeks ago. Rather than continuing to work with the developers who maintain Bitcoin’s code ...
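The roughly-seven-per-second ceiling falls out of simple arithmetic on the block size cap and the average block interval. The ~250-byte average transaction size used here is a rough assumption; real averages vary.

```python
# Back-of-envelope check on the "seven transactions a second" ceiling,
# assuming the 1 MB block size cap and the ten-minute average block
# interval. The average transaction size is a rough assumption.

block_size_bytes = 1_000_000     # the contested 1 MB cap per block
avg_tx_bytes = 250               # assumed average transaction size
block_interval_s = 600           # one block every ten minutes on average

tx_per_block = block_size_bytes / avg_tx_bytes       # 4000 per block
tx_per_second = tx_per_block / block_interval_s
print(round(tx_per_second, 1))   # 6.7
```

Raising `block_size_bytes`, which is what Andresen proposed, raises the ceiling proportionally.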
What happens if an AI machine commits a crime? Who is responsible for the actions taken? This may sound like science fiction, but it has already happened. A Swiss art group created an automated shopping robot with the purpose of committing random Darknet purchases. The robot managed to purchase several items, including a Hungarian passport and some Ecstasy pills, before it was “arrested” by Swiss police. The aftermath resulted in no charges against the robot nor the artists behind the robot. How should an AI machine be regulated when it is acting on its own, outside the control of humans? There have already been several regulatory problems identified for controlling and regulating artificial intelligence.
Today, cybercrime costs companies more than $300 billion worldwide, and nearly all of it’s due to someone trying to steal credit cards, identity information, trade secrets, etc. Today’s hackers are all grown up and take the form of transnational organized crime rings, terrorist cells, hacking co-ops and groups and even nation-states and foreign intelligence services. According to Marc Goodman in Future Crimes, “The defender must build a perfect wall to keep out all intruders, while the offense need find only one chink in the armor through which to attack.” Make no mistake, these people are serious, they’re in it for the money, they’re organized and well-funded, they’re highly skilled, and they will find you.
The socialization of cyber threats among all levels of a company’s workforce reinforces the concept that cyber security is a shared endeavor. For example, social engineering and spearphishing e-mails that target one class of worker may not target another; yet it is imperative that everyone be cognizant of what they entail, how suspicious e-mails can be checked, and what should be done if they are received. This instills the knowledge that each employee has a vested interest in safeguarding the organization by ensuring its sensitive information and accesses are preserved and maintained. Accountability and responsibility must not be projected as burdens that punish employees or risk impeding business operations for the sake of compliance.
Quote for the day: “Ultimately, the only thing that matters is what we do for other people.” -- Daniel Vasella
ISOs provide a simple and elegant solution: a single point of contact for all of a 21st century company’s IT infrastructure requirements. These organizations operate globally, which means that companies that partner with ISOs will experience a consistent quality of service no matter where they are operating, or what kind of technology they are employing. In fact, by consolidating service contracts and streamlining IT maintenance processes, ISOs not only provide companies with reliable IT infrastructure sustainment, but also help their partners to enhance their own efficiency. Any global organization that requires hardware maintenance across a broad range of equipment makes and models will benefit from working with an ISO.
The IoT automotive industry is moving rapidly, with many exciting growth opportunities available. We’ve written about some of the risks and benefits as well as some of the players involved. One thing is for certain: the auto industry is starting to take notice, and we can expect the implementation of a number of new IoT technologies over the next several years. One of the largest and most critical investment strategies will be in IoT security ... Anyone looking for more information on how the cloud, big data and the IoT can scale and connect can get a better idea of the potential below. IBM has produced an excellent infographic centered around the opportunities.
Understanding the potential value of data consumes a lot of analysts’ time. For instance, an analyst for an auto manufacturer seeking to streamline its manufacturing processes would likely endure many false starts when exploring the mass of information related to the engine-building process, from poorly scheduled lunch breaks to disconnects between suppliers. Utilizing big data discovery solutions can sort information by potential, with the most interesting attributes appearing first. In addition, analysts can easily experiment with different combinations of data to understand correlations, so they can rapidly determine whether a data set is worthy of more attention.
A big part of a hybrid cloud is the ability to replicate and distribute data. First of all, it’s important to understand what you’re replicating and to where. Many organizations deploy hybrid cloud platforms to help get applications and data closer to their users. Others use a hybrid cloud to control bursts and branch locations. Regardless, it’s important to know how data is being moved, backed up and optimized. Data replication can be a tedious process if not done properly. It’s also important to take security into consideration: your data is a critical asset, and it must be secured at the source, along the route, and at the destination. Fortunately, virtual security appliances and services can help make this process a bit easier.
"If you want to create a workforce ... you want to create a talent pipeline, you cannot simply ignore half the population," said Memon. Beyond the U.S., other countries are leaving fewer women behind when it comes to computer science and engineering. In both Malaysia and Indonesia, women earn roughly half of the computer science and engineering degrees, while only one-fifth of those same degrees are earned by women in the U.S. Not only can women fill the estimated nearly 210,000 vacant cybersecurity positions in the United States, they can also bring new perspectives. "When you have a balanced team of both men and women, the teams are able to look at things a little bit differently and make sure that you're really looking at all causes, all effects and really get to the heart of the problem," said IBM Security's Westman.
If you're anything like me, you have mixed feelings whenever interns enter the equation. Who couldn't use more people to get things done, right? That said, how much disruption comes along with this brilliant idea? Well, that depends, but it's definitely not zero. Up until recently, data science teams have been reserved for the veterans — the brave, seasoned programmer/mathematicians who valiantly volunteered for the perilous role. However, the universities have quickly caught on, and they're rapidly minting fresh new data scientists who are eager to explore their new profession. That's where you come in to show them the ropes. Your boss thinks it's a good idea, and she's the only one that matters. It's up to you to make the most of the experience. Here are four key strategies for getting the most from your data science interns.
This idea of security through obscurity is worse in the hardware world and we’ve seen this with the Xbox for instance. So, the Xbox got hacked, the security keys on it got hacked and then everybody had open access to the Xbox. So I don’t believe that security by obscurity will work in this case, but at the same time I am not yet convinced that the community is mature enough to act as a community. So in the software world we’ve had open source for 15 years, we have a lot of people who contribute best practice to open source. In hardware, open source is a pretty new concept and I think a lot of the people who are manufacturing devices and building IoT systems are not there yet in terms of sharing their best practice and working as a community in the same way the software world is.
The first technology that will be transferred by EY is called PathScan, which detects abnormal activity on networks that indicates the presence of hackers. Uncovering hackers on networks has been a struggle for many companies. On average, attackers operate inside a victim’s network for more than 200 days before being detected, according to FireEye Inc., a network security company. PathScan is being tested at five companies and already proving valuable, according to EY. The firm believes the relationship with the lab will be successful because the technology being transferred has market value and will be combined with its other services and expertise, MacDermott said.
Most systems administrators tell Donnie Berkholz, a development, DevOps and IT operations analyst at 451 Research, that there is no such thing as a single pane of glass that works for everyone. "The idea should be to provide a single pane for a specific [person] in a specific situation," he said. For example, there is one view that IT pros may want during normal operations versus during a project to troubleshoot and find a root cause. "There is absolutely a desire to have a unified view integrating multiple data sources, given those constraints," he said. It's a different view of the single pane of glass that takes the users into account.
With the release of Android 5.0, also known as Lollipop, Google introduced its new Material Design style. Material Design is a huge step forward for Android apps, bringing with it a total overhaul of theming, as well as a bunch of new UI components you can use in your apps. To make things even better, Google also released the Android Design Support Library, which allows you to take advantage of Material Design in versions of Android going all the way back to Android 2.1. ... Android has had the DrawerLayout component for some time now, which allows you to easily create "hamburger"-style menus in your apps. Hamburger-style menus have become ubiquitous in both Android and iOS in recent years.
“A lot of effort is put into setting up the initial relationship, but organisations typically select a supplier that is low-risk to begin with and there is no provision for monitoring how or if that changes,” he said. Wilkinson said organisations need to recognise a lot can change after a supplier is first selected, which means low-risk suppliers can become high-risk over time. “This is not a back-office operation that can be set once and work well for the next five years – you have to continually re-evaluate and re-assess as things change,” he said. According to a Booz Allen Hamilton report, the majority of third-party risk incidents at an organisation are likely to occur in an existing relationship.
Quote for the day: "Daring ideas are like chessmen moved forward; they may be beaten, but they may start a winning game." -- Goethe
What sets apart the CIOs who don't fit this pattern? Langer described 23 characteristics in his recent webinar, Strategic IT: The Transition Taking Place in the CIO Role. The material was based on research and interviews that he and his colleague Lyle Yorks conducted for their similarly named book. What the authors discovered is that the most successful CIOs have developed strategy advocacy, or "a process through which technology leaders in organizations build on functional expertise." In other words, success in the CIO position has less to do with building their technology prowess and more to do with the ability to master other areas of expertise important to running a business.
"... a real-time system is one that behaves deterministically, responding predictably to inputs or changes in the environment. Typically these are cyber-physical systems, used to manage a physical process. "Observers often confuse real-time computing with high-speed computing, such as financial trading or sports betting," adds Barnett. "The difference between high-speed computing and real-time computing is that with high-speed computing you are talking about averages -- you can say on average an operation takes a millisecond. But one time in a thousand it takes much longer. With real-time computing you are confident the operation took place within the deadline, or you know it didn't happen."
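Barnett's average-versus-deadline distinction can be made concrete with a toy workload: the average looks excellent, but the one-in-a-thousand outlier is what a real-time guarantee must bound (all latency numbers here are invented for illustration):

```python
import statistics

# Synthetic latencies: 999 fast operations plus the "one time in a
# thousand" slow one that Barnett describes (numbers invented).
latencies_ms = [1.0] * 999 + [50.0]

deadline_ms = 5.0
avg = statistics.mean(latencies_ms)     # what high-speed computing reports
worst = max(latencies_ms)               # what real-time computing must bound

print(f"average latency: {avg:.3f} ms")
print(f"worst-case latency: {worst:.1f} ms")
print(f"every operation met the {deadline_ms} ms deadline: {worst <= deadline_ms}")
```

The workload's average is comfortably under the deadline, yet the worst case blows through it, which is exactly why a real-time system is judged on its worst case, not its mean.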
Most of us have heard about Conway’s law. It claims that organizational structure (with its related processes) and the resulting system architecture are related - they go hand in hand. And that’s of course no surprise. Consider a company with highly strict functional departments and a lack of interdepartmental collaboration. What kind of system would it produce? It would likely end up designing a set of isolated components, each exposing a unique and complicated interface. That’s an example of a causal connection between organizational structure and system architecture. What is actually interesting here is that this connection can be reversed! Meaning: you can influence changes in organizational structures by reshaping your system architecture.
There has also been an evolution of the CISO, cyber gurus, and security management teams who feel they only need to understand the basic fundamentals of what cybersecurity is, leaving the day-to-day interpretation of operational security to those lesser mortals who at times do their level best in the absence of any training or real investment. In fact, don’t take my word for it; look at some of those respectable organizations who have hit the press after some very successful compromises. Moreover, there are those who have suffered unauthorized incursions with the devil’s luck of not being discovered, or of suffering name and shame. On that subject, I have been unfortunate enough to follow some renowned CISOs in the industry into their departed organizations, only to find, to my surprise, the fragile fabric of a security structure.
Sensitivity analysis is an important step in evaluating the stability, and hence the quality, of our optimal solution. It also provides guidance on which areas need more effort to make the estimation more accurate. Mathematical programming allows you to specify your optimization problem in a very declarative manner, and it outputs an optimal solution if one exists. It should be the go-to approach. The downside of mathematical programming is that it requires linear constraints and linear (or quadratic) objectives. It also has limits on the number of decision variables and constraints it can store (a limitation that varies among implementations). Although there are non-linear solvers, the number of variables they can take is even smaller.
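As an illustration of the declarative style and of sensitivity analysis, here is a toy two-variable linear program solved by enumerating the vertices of the feasible region (this brute-force approach is valid only in two dimensions; real solvers use simplex or interior-point methods, and every coefficient below is invented):

```python
from itertools import combinations

# Toy production-planning LP (all numbers invented):
#   maximize 3x + 5y                  (profit)
#   subject to  x + 2y <= 14          (machine hours)
#              -3x +  y <= 0          (product-mix requirement)
#               x -  y <= 2
#               x >= 0, y >= 0
constraints = [  # each row encodes a*x + b*y <= c
    (1, 2, 14),
    (-3, 1, 0),
    (1, -1, 2),
    (-1, 0, 0),
    (0, -1, 0),
]

def feasible(x, y, eps=1e-9):
    return all(a * x + b * y <= c + eps for a, b, c in constraints)

def vertices():
    # In 2-D the optimum of an LP lies at a vertex: the intersection
    # of two active constraints (Cramer's rule solves each 2x2 system).
    for (a1, b1, c1), (a2, b2, c2) in combinations(constraints, 2):
        det = a1 * b2 - a2 * b1
        if abs(det) < 1e-12:
            continue  # parallel constraint pair, no intersection
        x = (c1 * b2 - c2 * b1) / det
        y = (a1 * c2 - a2 * c1) / det
        if feasible(x, y):
            yield x, y

profit = lambda p: 3 * p[0] + 5 * p[1]
best = max(vertices(), key=profit)
print("optimum:", best, "profit:", profit(best))

# Sensitivity analysis: relax machine hours by one unit and re-solve;
# the resulting profit increase is that constraint's shadow price.
constraints[0] = (1, 2, 15)
best_relaxed = max(vertices(), key=profit)
print("shadow price of machine hours:", profit(best_relaxed) - profit(best))
```

The second solve is the essence of sensitivity analysis: it tells you how much the optimum moves when one input estimate changes, and therefore which estimates are worth refining.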
“There’s potential but the practical applications are still a little immature,” says Jon Oltsik, senior principal analyst at Enterprise Strategy Group. “You can tune something to look for an attack that you know about, but what’s hard is to tune it to something you don’t know about. I can look at access patterns on repositories and how much people download and whether they save documents locally. But there’s always creative ways to work around that. A really dedicated, sophisticated adversary will quickly decipher where you’re not looking – and that’s the problem.” Or they will carry out a “low-and-slow” theft by regularly moving data to a repository over time, he adds.
"When you put applications in more than one place, you have to synchronize data," said Phil Shelley, president of Newton Park Partners, a Chicago-area consulting firm. Getting that synchronization right isn't easy, he said. And the closer it gets to happening in real time, the more complex the challenge can become. The challenges of a hybrid environment arise around several key areas: data, timing and networking needs, as well as resource provisioning -- that is, getting the time, money and personnel needed to do the integration work. ... "It is a more complicated world when you start moving components of your IT stack outside. There are obviously benefits to that, but it is a more complicated world. It gets harder when one side isn't in your company," Doug Shoupp said. Sometimes, an API may be all that's needed, Shoupp said, but that is rare.
Lack of trust destroys your team. That we all know, but Wayde shares how that phenomenon affected one team he worked with, and some antidotes to that process. In this episode we also mention a book dedicated to highly functioning teams: Patrick Lencioni’s The Advantage, and share 2 games you can play with your team to grow trust. Wayde is an Agile coach with TeamFirstDevelopment.com. He is interested in helping teams improve using the same techniques that Improv theater teams use to develop Great Team Players.
Clearly, businesses need to step up their assessments of third parties and supply chain partners. It is also essential that they stipulate the right to assess a supply chain partner’s security capabilities in contracts. Experience shows that organizations that do not legally plan for due diligence when executing contracts may not be allowed to perform adequate assessments when necessary. Also consider that as much as 20 percent of security spending is estimated to occur outside of the information technology (IT) function on services like cloud computing. Contracts executed outside of IT may not allow for due diligence and, in fact, they may require important information security and privacy safeguards.
It also means cyber professionals are hopping from one job to another, leaving gaps in how their systems are protected, also increasing the likelihood of attacks. Finally, businesses are forced to train or hire unqualified employees to fulfill their cybersecurity needs. It’s no wonder 86 percent of organizations believe there’s a shortage of skilled cybersecurity professionals and just 38 percent believe their organization is prepared for a cyberattack, according to a January survey from ISACA, an international professional association focused on IT governance. The fear crosses over to government agencies as well, as we’ve seen with several high-level cyberattacks. For this reason, President Obama has been quietly recruiting top tech talent from companies such as Google and Facebook to increase the number of qualified cyber talent in Washington.
Quote for the day: “Never follow anybody who hasn't asked 'why'.” -- Aniekee Tochukwu
In this study, four efficient tools for analyzing patent documents were tested: Thomson Reuters' Aureka and Thomson Data Analyzer, Biowisdom's OmniViz, and STN's STN AnaVist. All four tools analyze structured and unstructured data alike. They all visualize the results achieved from clustering the text fields of patent documents and either provide basic statistics graphs themselves or contain filters for performing them with other solutions. The tools were tested with two cases, evaluating their ability to offer technology and business intelligence from patent documents for companies' daily business. Being aware of the state of the art in relevant technology areas is crucial for a company's innovation process.
Mesos, the kernel of the Mesosphere DCOS, is a 6-year-old Apache open-source project, conceived at the University of California, Berkeley, that was announced as a joint collaboration with Mesosphere at DockerCon EU in December 2014. The company has come a long way in the nine months since then, as more and more enterprises retool their data centers to run DCOS. Mesosphere DCOS is a highly scalable engine that enables the running of services and applications across a cluster of machines in a data center or cloud. It is highly container-driven. It combines the Apache Mesos cluster manager with a number of open-source and proprietary components and allows services to be deployed and managed through both a custom Web UI and command-line interface.
Based on the Gen5 platform architecture that Brocade uses for its storage and networking products, the Brocade Analytics Monitoring Platform comes in a 2U form factor that can be configured with up to 24 Fibre Channel ports. The appliance itself sports two dedicated multi-core processors for frame processing and an onboard solid-state disk drive. From a software perspective, it runs an implementation of Brocade’s Fabric OS (FOS) that includes analytics capabilities and can be integrated with Brocade Network Advisor software. Rondoni said IT organizations can use the platform to generate customized reports to correlate and summarize trends and specific events.
In-room tablets at hotels serve as media hubs, control centers, and information desks. A guest enters the room - curtains open, music plays and climate control switches on, with the option for guests to personalize these settings. When guests leave the room, the settings reset to defaults, but personalized settings can be saved and automatically applied on future visits. The bathroom mirror serves as an interactive display for news, weather and messages, with the ability to pair with smartphones or tablets. Electro-responsive fibers in pillows monitor blood pressure, sleep patterns and stress levels.
The value and power of a data lake are often not fully realized until we get into our second or third analytics use case. Why is that? Because it is at that point that the organization needs the ability to self-provision an analytics environment (compute nodes, data, analytic tools, permissions, data masking) and to share data across traditional line-of-business silos (one singular location for all the organization’s data). Both are needed to support the rapid exploration and discovery processes the data science team uses to uncover variables and metrics that are better predictors of business performance. The data lake enables the data science team to build the predictive and prescriptive analytics necessary to support the organization’s different business use cases and key business initiatives.
Windows 10 is Microsoft's effort to recapture many enterprise users who balked at Windows 8's mobile-focused interface and to finally move the last Windows XP and Vista holdouts onto a newer OS. Beyond the new features, security updates, and its platform-unifying design, Windows 10 marks a significant shift in how Microsoft characterizes its flagship operating system. Microsoft is encouraging people to think of Windows 10 as a "service". Instead of releasing a new numbered version of Windows every few years, the company will continuously release new features and updates. Microsoft has committed to support Windows 10 for a decade after the July 2015 launch.
Companies that know and understand the similarities and differences across their information, data and storage media, along with their associated lifecycle management and tiered protection, can unlock value while removing complexity and costs to sustain growth. Organizations should start by revisiting information, data and storage media management, along with their corresponding lifecycles. Then, they should focus on what can be accomplished today in comparison to how processes worked in the past. This will allow companies to distinguish between their needs and wants. Businesses can then begin to remove costs by finding and addressing data protection complexities at the source, as opposed to cutting service.
Analytical models have greatly evolved, both in the depth of the mathematical techniques and in the widespread application of the results. The methodology for creating analytical models, however, is not well described, as can be seen by the fact that the job of analytic practitioners (currently called data scientists; older names are statistician, data analyst and data miner) involves a lot of tacit knowledge: practical knowledge not easily reducible to articulated rules. This informality can be seen in many areas of analytical modeling, ranging from the project methodology and the creation of the modeling and validation data to the analytical model-building approaches and model reporting. The focus of this document is project methodology.
Services were modeled based on the business capability model, and the first release went well. They were XML-over-JMS sync services, primarily focused on delivering the capabilities required for the claims platform exposed to agents, web and voice channel applications. This gave us the ability to deploy frequent, small changes and to support A/B features seamlessly for our applications. When requirements were incrementally added (and they always were), it was very hard to release the solution rapidly because of the integration complexity between applications and consumers. Integration, functional testing, and production release required tight coordination.
It’s important to quickly understand that cloud computing isn’t going anywhere. In fact, the proliferation of cloud computing and various cloud services is only continuing to grow. Recently, Gartner estimated that global spending on IaaS is expected to reach almost US$16.5 billion in 2015, an increase of 32.8 percent from 2014, with a compound annual growth rate (CAGR) from 2014 to 2019 forecast at 29.1 percent. There is a very real digital shift happening for organizations and users utilizing cloud services. The digitization of the modern business has created a new type of reliance around cloud computing. However, it’s important to understand that the cloud isn’t just one platform. Rather, it’s an integrated system of various hardware, software and logical links working together to bring data to the end-user.
Quote for the day: "The final test of a leader is that he leaves behind him in other men, the conviction and the will to carry on." -- Walter Lippmann
Historically, trying to measure "software quality" has been tricky because we've tried to measure attributes of the code, and the team delivering the code was not actually responsible for providing the ultimate customer-facing service. Personally, I think the only metrics that really matter are those related to the "consumer experience" of the system: percentage of successful API calls responded to in a reasonable amount of time, number of customer purchase transactions, number of applications successfully processed, etc. Of course, it's only fair to start measuring a team on these metrics if the team has a reasonable degree of influence on them. So, to some extent, this approach implies "DevOps" or "product teams" or whatever we want to call them.
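Consumer-experience metrics like these reduce to simple aggregations over request logs; here is a minimal sketch, with all the record values invented:

```python
# One record per API call: (succeeded, latency_ms); values invented.
calls = [(True, 120), (True, 340), (False, 95), (True, 80),
         (True, 2100), (True, 150), (True, 60), (False, 5000)]

threshold_ms = 500
# A call only counts as "good" if it succeeded AND responded in a
# reasonable amount of time -- both halves of the metric in the text.
good = sum(1 for ok, ms in calls if ok and ms <= threshold_ms)
slo = good / len(calls)
print(f"{good}/{len(calls)} calls succeeded within {threshold_ms} ms ({slo:.1%})")
```

The point of the combined condition is that a fast failure and a slow success are equally bad from the consumer's perspective, so neither should count toward the metric.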
The disruptive technologies of cloud-based applications, delivered through browsers and apps to a variety of devices, are all part of the external environment and linked to the role of front office. New business models are focused on taking these external capabilities and redefining how to find, win and deliver new forms of competitive offerings. Front office environments are focused on people who create value through external interactions to win and deliver business, people working Outside-In. This is unlike the back office, where the focus is on process removing people and cost. Outside-In technologies enable the people in the front office to find and share the resources they need to improve their performance within these new business models. “The Future of Work” is a term used to describe the manner in which these new technologies are deployed in new optimal ways.
In conjunction with mobility, big data is changing the way patients engage with their doctors and experience their treatment. Research has found that three out of five patients would choose telehealth visits over in-person appointments for minor check-ups and follow-ups. In PwC’s survey, more than 50 percent of respondents would feel comfortable sending a digital photo of a rash or skin problem to a dermatologist for an opinion. Not only is the technology for “virtual treatment” available, but 64 percent of surveyed patients expressed their willingness to adopt new, non-traditional ways of seeking medical attention. In a world where services are available in an instant, doctors must start treating their patients as customers to continue to meet their needs. That includes opening lines of communication for easier visits and quicker treatment.
What makes new security risks particularly challenging is their fluid and dynamic nature; the rapid rate of change has proven to be increasingly difficult for organisations to keep up with. “It’s somewhat like being in a submarine with leaks that pop up in random places at random times”, Booch explains. “You have to be vigilant about not just reacting to security threats – any company has to be diligent about keeping up with the latest patches and attending to zero day exploits – but also to be proactive in seeking out potential risks”. The traditional and perhaps even stubborn mind-sets of those in the IT sector are slowing down progress in cyber securitisation, so accustomed are people to protecting their businesses and assets in a certain way. Yet this rigid approach is no match for hackers.
Among other things, the ability of Clear Containers to run on Rocket affirms CoreOS's design choice to map different "stages" for different operational characteristics for a container. CoreOS also implemented "pods" with its runtime. Pods allow multiple containers to function as a single logical service, even if the containers have been spread over multiple hosts in a cluster. ... "For the little function you need, you don't need the full QEMU layer," Sousou said, referring to the code for the emulation of a complete x86 machine that's part of a hypervisor startup. Intel stripped QEMU out of the KVM initialization process, along with multiple other minute adjustments, to take milliseconds out of the startup process.
That “hero vs. zero” attitude has shifted considerably in the past few years as the relationships between the CIO and CMO has matured, says Tom Litchford, vice president of retail technology at the National Retail Federation. “The whole idea is that the CIO and CMO really have to be attached at the hip,” he says. “As we go forward, there is less of the old feeling that “all I ever hear from IT is ‘no.’” The Forrester/NRF study reported improved relationships between the retail CIO and line-of-business colleagues such as the CMO. ... These issues go beyond technology into fundamental issues related to marketing and the entire organizational structure, so CIOs and CMOs must each bring their separate strengths to the table.
Cybersecurity is a new issue for the industry, one handled by automakers in different ways. That varied and still-developing approach has fueled industry critics, including some lawmakers, who say the industry lacks a comprehensive solution to safeguard their customers. The immediate threat of malicious hackers wreaking havoc on connected cars appears to be relatively remote. The researchers who remotely controlled some Jeep Cherokee vehicle systems ... were highly sophisticated security experts who spent years developing the tools needed to complete the hack. Hackers seeking monetary gain have little current incentive to target cars. Even though vehicles can collect huge amounts of data, the auto industry has yet to monetize it in a major way.
The cardless ATM technology is the latest attempt by banks to persuade customers under 35 to open an account with them instead of migrating to their traditional competitors or the latest Silicon Valley startup that promises to help consumers borrow, manage, and invest money through their phones. Hudson-based Avidia Bank said earlier this week that it had introduced the new technology to the ATMs at its eight branches in Central Massachusetts. Salem Five Bancorp launched cardless ATMs this month at its 30 ATMs, primarily on the North Shore. Twenty banks across the country, mostly regional and community banks, also have gone mobile, although the ATMs still accept traditional debit cards, said Doug Brown, senior vice president and general manager of mobile at FIS, the Florida banking technology firm that makes the mobile software for the ATMs.
Private and hybrid cloud implementations of data and analytics often coincide with large data integration efforts, which are necessary at some point to benefit from such deployments. Those who said that integration is very important also said more often than those giving it less importance that cloud-based analytics helps their customers, partners and employees in an array of ways, including improved presentation of data and analytics, gaining access to many different data sources and improved data quality and data management. We note that the focus on data integration efforts correlates more with private and hybrid cloud approaches than with public cloud approaches, thus the benefits cannot be directly assigned to the various cloud approaches nor the integration efforts.
Quote for the day: “If it involves technology, it is your fault if it breaks. The CIO should have seen it coming.” -- Earl Perkins
The first thing insurers should realize is that this is not an arms race. The winners will be the ones that take a measured and scientific approach to building up their machine learning capabilities and capacities and – over time – find new ways to incorporate machine learning into ever-more aspects of their business. Insurers may want to start small. Our experience and research suggest that – given the cultural and risk challenges facing the insurance sector – insurers will want to start by developing a ‘proof of concept’ model that can safely be tested and adapted in a risk-free environment.
Intuit promised that it would continue to maintain and develop Quicken until it finds a buyer, adding that it plans to release the next edition, Quicken 2016 for Windows, and would keep working on the Mac version. Current users should see no interruption in their ability to use the software or its associated services, such as Quicken Bill Pay. "As we move through this sale, it's business better than usual," wrote Eric Dunn, who heads the Quicken unit, in an online statement. "As a standalone business, we'll focus solely on taking Quicken to the next level. And until we find that buyer, we'll continue to provide you with [the] dedicated, uninterrupted service and support you deserve."
Cybersecurity is no longer an emerging issue. Major headlines about breaches in both the public and private sectors have put the topic on every company’s agenda, regardless of size, industry or geographic location. Just like the management teams they oversee, corporate directors are very well aware of the ‘what’: the fact that cybersecurity is a significant threat. What they are looking for is the ‘how’: specific action they can take to be more effective in overseeing management’s activities. One independent director – a committee chair of a Fortune Global 100 corporation – recently told me: “Cybersecurity is uncharted territory. As directors, we have to depend on staff whose capabilities we aren’t equipped to judge, it’s difficult to measure progress and there’s no way to tell if we’re doing enough.”
To apply Huffman adaptive encoding to a string, we normally iterate through the characters of the string and encode each one in turn. The main idea of the adaptive Huffman algorithm is that encoding starts with an “empty” Huffman tree, which contains no entries for the characters to be encoded, and the tree is then modified by appending new characters along with their codes as the encoding proceeds. A key requirement is that the Huffman tree be modified identically during both encoding and decoding, because in both cases we need to generate the same code for each character from the input buffer, regardless of which operation is being performed.
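A minimal sketch of the symmetric-update idea follows. It is deliberately simplified: it rebuilds the code table from frequency counts at each step rather than incrementally rebalancing the tree as the FGK/Vitter algorithms do, and it escapes never-seen symbols as literals. The point it illustrates is the one above: because both sides apply the identical model update after every symbol, the decoder always derives exactly the codes the encoder used.

```python
import heapq

def build_codes(freq):
    """Build a deterministic Huffman code table from symbol frequencies."""
    if len(freq) == 1:
        # Single known symbol: give it a one-bit code.
        return {next(iter(freq)): "0"}
    # Heap entries carry a unique tiebreaker so ordering is deterministic
    # and dicts are never compared.
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(sorted(freq.items()))]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (f1 + f2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

def adaptive_encode(text):
    """Encode text starting from an empty model; unseen symbols escape
    as literals, and the model is updated after every symbol."""
    freq, out = {}, []
    for ch in text:
        if ch in freq:
            out.append(("code", build_codes(freq)[ch]))
        else:
            out.append(("literal", ch))   # symbol not yet in the tree
        freq[ch] = freq.get(ch, 0) + 1    # model update
    return out

def adaptive_decode(tokens):
    """Mirror the encoder: same empty start, same update per symbol."""
    freq, out = {}, []
    for kind, val in tokens:
        if kind == "literal":
            ch = val
        else:
            codes = build_codes(freq)     # identical model state as encoder
            ch = next(s for s, c in codes.items() if c == val)
        out.append(ch)
        freq[ch] = freq.get(ch, 0) + 1    # same update as the encoder
    return "".join(out)
```

A round trip such as `adaptive_decode(adaptive_encode("abracadabra"))` returns the original string, with no code table ever transmitted: both sides reconstruct it from the shared update rule.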
Agile is a method that is highly dependent on individuals and the way that they approach their work. It requires participants to take on new roles that they would normally not adopt. Leadership roles move from person to person, and each must have the freedom to commit to the team. These commitments often cross organizational boundaries. In a traditional development environment, managers set priorities and deadlines, but in an Agile environment, managers shift to a facilitation and enabling role. Managers become channels for success, creating new lines of communication and business relationships. They need to move away from their traditional command and control role. In an Agile environment, personal success is highly dependent on team success.
One of the great engines of change in the software profession has been Agile. Not only has it transformed the way that development teams work, but it has had profound ripple effects across the entire software value stream. Agile is far more than a difference in batch size. The challenges Agile posed to traditional assumptions about planning in the face of uncertainty, the centrality of the team, the delivery of value, and other fundamental issues have affected everything from the inception of an idea to its eventual retirement. Testing, requirements gathering, rapid and continuous delivery, governance rules, customer collaboration, marketing, change management — all of these activities within the value stream, and more, have had to adjust to Agile.
We see new systems and technology being dropped into the business – often ‘point solutions’ to solve a particular problem – without a proper diligence process to sense check for wider synergies. Time and again, readiness assessments, training and business change are not well executed. This means that the business is not ready and new ways of working don’t get introduced. The net result is that the business stays in its comfort zone and introduces workarounds so it can maintain old practices – it fills gaps with manual processes and spreadsheets and does not use the new systems to enable the changes and release benefits.
If the issues are spotted early, organisations can review the specification documents and rectify the project’s direction to ensure it meets the true requirements. With a fragmented process, companies can also find that ongoing work causes issues in deliverables that have already been signed off and leads to live software breaking. Having good levels of communication and working with a third party are great ways to prevent problems from occurring in the first place. The team can also increase success by employing an Agile project management approach that enables the team to gather stakeholder and user feedback on the product from the very beginning.
"We want reusable highly scalable and flexible platforms," Dyson says. "We want to strengthen our cybersecurity and continuous monitoring posturing -- that's very important whether we're working on-prem or in the cloud." The determination of whether or not to roll out a SaaS application at the SEC "has a lot to do with timing," Dyson explains. The regulatory agency is charged with drafting and implementing rules for the securities industry, a process that is guided by deadlines mandated in statute or by the commission's own timetable. In that context, the consideration of a cloud deployment can become a question of whether or not the technology will support the agency's regulatory mission.
The use of Innovation Games was suggested as a way of helping the executives understand the new ideas, along with pilot programs that provide internal proof by starting small and showing the benefits. An issue that was specifically identified was the fear felt by middle managers in many organisations as they see the significant changes that agile adoption brings to their roles. A common pattern is a reduction in the number of middle manager roles and a migration into more hands-on roles such as Scrum Master or Product Owner. There needs to be a clear transition path for these managers, and the importance of strong executive support in overcoming this resistance was strongly emphasised.
Quote for the day: “Leaders are people who believe so passionately that they can seduce other people into sharing their dream.” -- Warren G. Bennis
The iPhone is popular, but it still accounts for only about 14% of worldwide smartphone market share, compared to Android’s 79%. In actual numbers, 1.1 billion Android-based phones are expected to ship in 2015 vs. 237 million iPhones. When I search Verizon Wireless’ website there are 9 iPhones available as compared to 29 Android options made by six different vendors, from Google to HTC to Motorola to, of course, Samsung. Android is an operating system that can work on many different devices, whereas the iPhone’s operating system, iOS, only works on devices made by Apple. Even though I’ve decided to get another Samsung, I like having the flexibility to choose other hardware that fits my needs and my company’s, and I don’t get that with Apple.
There's no doubt that Hadoop has a place in the enterprise, especially as big data applications take hold. But the venerable EDW has a well-established presence in data centers, and after years of refinement plays a significant role in meeting the reporting and analytics needs of most organizations. Does the emergence of Hadoop mean it's time to abandon the EDW? Some IT and data management professionals are aching to use Hadoop as a replacement for the data warehouse -- but are companies really prepared to abandon their decades-long investments in EDW infrastructure, software, staffing and development?
If you don’t take the time to check up on your mentees and listen to their concerns, travails, and triumphs, then you will have no metric for achievement. Employing agility as a mentor requires sensitivity, creativity, and solid communication skills. It also requires the foresight to see what your mentee should be aiming for, and the hindsight to see what your mentee has already accomplished. To establish a framework for gauging your mentee’s progress, consider the three phases every new team member goes through in some form. The first phase is total unfamiliarity and constant discovery; the second is a transitional period with a clear trajectory of progress; and the third is self-driven competence. In all three phases, remember that agility remains your most vital tool.
According to a June study by Fidelis Security and the Ponemon Institute, 26 percent of board members admit to "minimal or no knowledge" about cybersecurity, and only 33 percent say that they are "knowledgeable" or "very knowledgeable." ... 70 percent of board members said that they understand the security risks to the organization, but only 43 percent of IT security professionals agreed that the board understood those risks. Only 18 percent of IT security professionals rated their companies' cybersecurity governance practices as very effective -- compared to 59 percent of board members. This is a difficult communications gap that needs to be addressed both at the board level and by CISOs themselves.
Two major problems exist for two different classes of websites. First, larger websites that use many third-party services (ad networks, CDNs, etc.) need all of those services to support HTTPS before the main website can switch. Slowly, these services are starting to support HTTPS, which means it will become easier and easier for larger websites to make the move. Second, for smaller and non-profit websites, getting and installing an HTTPS certificate is still a pretty confusing process. New tools like SSLMate and Let's Encrypt are starting to make it easier and more automated, so that moving a small website to HTTPS becomes fast and painless.
Executives who think they're in a technology arms race are focusing on the wrong area: The 2015 Digital Business Global Executive Study and Research Project by MIT Sloan Management Review and Deloitte identifies strategy, not technology, as the key driver of success in the digital arena. Conservative companies that avoid risk-taking are unlikely to thrive — and they'll also lose talent, as employees across all age groups want to work for businesses committed to digital progress. The report is available online and as a PDF, and the online version includes a Digital Business Interactive Tool with interactive charts to explore the data set.
Some security experts have noted that the breach could have been a lot worse, at least in terms of compromising credit card information. According to Robert Graham's security blog: "Compared to other large breaches, it appears Ashley-Madison did a better job at cybersecurity. They tokenized credit card transactions and didn't store full credit card numbers. They hashed passwords correctly with bcrypt. They stored email addresses and passwords in separate tables, to make grabbing them (slightly) harder. Thus, this hasn't become a massive breach of passwords and credit card numbers that other large breaches have led to. They deserve praise for this." However, the account names, street addresses, email addresses, and phone numbers used to register for the site were not encrypted.
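The salted, cost-parameterized hashing praised above can be sketched briefly. bcrypt itself is a third-party package, so this sketch uses `hashlib.scrypt` from the Python standard library as a stand-in; the pattern is the same: a fresh random salt per password, a tunable work factor, and a constant-time comparison on verification.

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Return (salt, digest) using a salted, memory-hard KDF.

    scrypt stands in for bcrypt here; both make brute-forcing each
    stolen hash expensive, and the per-password salt defeats
    precomputed (rainbow-table) attacks.
    """
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.scrypt(
        password.encode("utf-8"),
        salt=salt,
        n=2**14, r=8, p=1,       # work/memory cost parameters
        maxmem=2**25,            # allow the ~16 MiB this setting needs
    )
    return salt, digest

def verify_password(password, salt, digest):
    """Recompute with the stored salt and compare in constant time."""
    _, candidate = hash_password(password, salt)
    return hmac.compare_digest(candidate, digest)
```

Because every call draws a fresh salt, hashing the same password twice yields different digests, which is exactly why a leaked table of such hashes is far less damaging than one of unsalted or fast hashes.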
"When they target somebody, they have to set something up so maybe they'll send out an e-mail that says, 'Your PayPal has been compromised' or 'Your e-mail has been compromised.' ... "The hackers may not even say that the victims' e-mail has been compromised. They may just say, 'You've been locked out of your e-mail' or 'There's some maintenance that needs to be done on the e-mail server' or 'Click here for new information.'" Barney says never click on an unfamiliar link. Often, such links will lead to a site designed to look like a legitimate, trusted site but will have a slightly different Web address. Other times it may take the user to a blank screen. Either way, the hackers' goal is to gather information that will help them steal valuable data.
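The "slightly different Web address" trick above is mechanical enough to illustrate. This is a deliberately naive sketch (real mail filters consult the Public Suffix List and much more); `trusted_host` and the sample URLs are made up for illustration. It flags any link whose hostname is neither the trusted domain nor a true subdomain of it, which catches look-alikes such as `paypal.com.evil.example`.

```python
from urllib.parse import urlparse

def looks_like_spoof(link, trusted_host="paypal.com"):
    """Return True if the link's hostname is not the trusted host
    or a genuine subdomain of it.

    Naive by design: a real check would also handle punycode,
    IP-address hosts, and the Public Suffix List.
    """
    host = (urlparse(link).hostname or "").lower()
    return not (host == trusted_host or host.endswith("." + trusted_host))
```

Note that a plain substring test would be fooled by `paypal.com.evil.example`; anchoring the comparison at a dot boundary from the right-hand side is what makes the check meaningful.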
Take a look at the Apache Software Foundation's (ASF's) list of projects and you may feel overwhelmed. Between top-level and incubating projects, there are far too many to keep track of. Filtering down the list to Big Data projects may not help, because that "smaller" list is still quite long. And don't forget that there are several noteworthy open source projects that aren't even under the ASF umbrella to begin with. So, in the name of helpful triage, here are five projects to keep an eye on:
"Complexity is the result of a diversity of footprints, of tools, of workforce," says Christopher Rence, CIO of Digital River, a provider of e-commerce, payments and marketing services for merchants. Rence knows whereof he speaks: He's lived through three acquisitions in the last four years, and has seen the residue of the 20 acquisitions that the company has experienced since 1994. "One company we acquired had nothing but white-label hardware. It didn't have an asset value, but it was doing a lot of processing," Rence recalls. In preparation for conducting a strategic migration of the data through a gateway into a SaaS solution, "we had to do a full inventory of what those homegrown products were doing," says Rence. "It required understanding some of the undocumented knowledge."
Quote for the day: “Whether driven by ambition or circumstance, every career gets disrupted.” -- Jay Samit