Habits, it seems, get in the way of change despite our best intentions. “Habits are triggered without awareness — they are repeated actions that worked in the past in a given context or in a similar experience,” Wood notes. Her research shows that concentrating on changing unwanted behaviors, and then creating new ones — not focusing on motivation — is the key to making change. She cites various efforts aimed at changing smoking habits in the U.S. from 1952 to 1999. Smoking decreased not when smokers were made aware of the health risks, but when buying and smoking cigarettes was made more difficult and less rewarding. Thus, higher taxes, smoking bans in public places, and limits on point-of-purchase ads — which add friction to smoking — were a more effective deterrent than warning labels on cigarette packages and public service advertising about smoking’s negative effects. A similar strategy of changing the context is possible in the workplace: Make old actions more difficult; make new, desired actions easier and more rewarding.
Before an IT group blithely answers the call for a collaboration system – by which we mean groupware applications such as Slack, Microsoft Teams, and Webex Teams – it's important to consider the security risks these systems may bring. That's because the same traits that make these, and similar, applications so useful for team communications also make them vulnerable to a number of different security issues. From their flexibility for working with third-party applications, to the ease with which team members can sign in and share data, low transactional friction can easily translate to low barriers for hackers to clear. When selecting and deploying collaboration tools, an IT staff should be on the lookout for a number of first-line issues and be prepared to deal with them in system architecture, add-ons, or deployment. The key is to make sure that the benefits of collaboration outweigh the risks that can enter the enterprise alongside the software.
A running Apache ZooKeeper cluster is a key dependency for running Kafka. But when using ZooKeeper alongside Kafka, there are some important best practices to keep in mind. The number of ZooKeeper nodes should be capped at five. One node is suitable for a dev environment, and three nodes are enough for most production Kafka clusters. While a large Kafka deployment may call for five ZooKeeper nodes to reduce latency, the load placed on nodes must be taken into consideration. With seven or more nodes synced and handling requests, the load becomes immense and performance might take a noticeable hit. Also note that recent versions of Kafka place a much lower load on ZooKeeper than earlier versions, which used ZooKeeper to store consumer offsets. Finally, as is true with Kafka’s hardware needs, provide ZooKeeper with the strongest network bandwidth possible. Using the best disks, storing logs separately, isolating the ZooKeeper process, and disabling swap will also reduce latency.
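A three-node production ensemble along these lines puts those recommendations into practice; this is an illustrative sketch, and the hostnames and paths are placeholders:

```
# zoo.cfg — illustrative three-node ensemble (hostnames and paths are placeholders)
tickTime=2000
initLimit=10
syncLimit=5
dataDir=/var/lib/zookeeper/data
# Keep the transaction log on a separate, fast disk to reduce write latency
dataLogDir=/var/lib/zookeeper/log
clientPort=2181
server.1=zk1.example.com:2888:3888
server.2=zk2.example.com:2888:3888
server.3=zk3.example.com:2888:3888
```

With three servers the ensemble tolerates the loss of one node while keeping quorum; five servers tolerate two, at the cost of more sync traffic per write.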
Mobile malware isn’t just an opportunistic tactic for cybercriminals. Kaspersky Lab is also seeing its use as part of targeted, prolonged campaigns that can affect many victims. One of the most notable discoveries this year was Skygofree, one of the most advanced mobile implants that Kaspersky Lab has ever seen. It has been active since 2014, was designed for targeted cyber-surveillance, and is spread through web pages that mimic leading mobile network operators. This is high-end mobile malware that is very difficult to identify and block, and the developers behind Skygofree have clearly used this to their advantage: creating and evolving an implant that can spy extensively on targets without arousing suspicion. ... In recent times, rooting malware has been the biggest threat to Android users. These Trojans are difficult to detect, boast an array of capabilities, and have been very popular among cybercriminals. Once an attacker has root access, the door is open to do almost anything.
Even CMOs who don’t have a technology background are becoming more tech savvy. Integrate CMO Vaughan said he considers himself and his fellow marketers technology investors, trying to manage a portfolio of tech to provide efficiency, effectiveness, and unique capabilities for the company. “We view technology as an enabler of our strategy and an important part of advancing our marketing capabilities,” Vaughan said. “We have tried to be very disciplined about not buying tech for tech’s sake, which is not always easy to do today with so many options. We start with the strategy — what we are trying to accomplish — and build a roadmap, including ROI and an adoption plan and model for each technology we evaluate.” Vaughan said CMOs should know what is available and at their disposal to differentiate and accelerate their strategy. “This does not mean you have to be a technology expert,” he said.
Preventing data from being lost or stolen is the most obvious “table stake” for consumers. Just as important is the question of whether marketers should have the data in the first place. This links clearly to the likes of GDPR in Europe, where the bar has been raised for all organizations around justifying the data they hold. But if we have the right data, for the right reasons, if we keep it safe, and if we are transparent about how we use that data to provide a more respectful, personalized, fair, and rewarding service to the consumer, then trust will grow. Equally, we need to trust the consumer, again by providing transparent access to the data we hold, clarity around how we use it, and the ability for them to control their data. Overall, the research shows that while consumers are rightly concerned about data privacy, they are also aware that data is an essential part of today’s economy, with 57% on average, globally, agreeing or strongly agreeing. Factor in the neutrals and around two-thirds of consumers are accepting of, or neutral about, data use in today’s data-driven, data-enabled world.
Although most of the standards in the framework aren’t necessarily new, they are “intended to be a clear articulation of what matters the most in our standards agenda, and is accompanied by a renewed commitment to their implementation,” said NHS Digital CEO Sarah Wilkinson in the framework’s foreword. Speaking at the UK Health Show on 25 September, Wilkinson said the potential for use of data in the NHS is huge, but the health service needs to get to grips with standards to reap the benefits. Most of the standards in the framework, which is currently in beta form and out for consultation, are based on international ones, though some are specialised for the NHS. These include using the NHS number as a primary identifier – a standard which has been in place for a long time but has seen mixed uptake. The framework said the standard “is live now and should be adhered to in full immediately”.
Australia has introduced Open Banking rules that will force the banks to share data with trusted Third-Party Providers (TPPs) by June 2019; Mexico has introduced a Fintech Law; South Korea and Singapore have enforced rules around financial data sharing between banks and third parties; and the USA has seen several banks innovating around open financial structures, although there is no law compelling them to do so, yet. What intrigues me about the market movements is that some large financial players are taking a lead in this space, such as Citibank and Deutsche Bank with their open API markets, whilst some are resisting the change. I have heard several reports in the UK that the large banks have made data sharing incredibly difficult for the customer, by making the permissioning process very onerous and time-consuming. Equally, the implementation of European rules under PSD2 has seen several Fintech firms cry foul, as each bank creates its own interpretation, and therefore API interface, of the law.
Running a city is always a challenging task. With Big Data, however, come new opportunities alongside new challenges. Instead of having to rely on surveys and manually tracking how people move throughout an area, cities can rely on sensor-derived data, providing far greater resolution and a pool of data orders of magnitude larger than was ever before available. Many of these advances may seem a bit mundane at first; developing improved traffic routes, for example, is unlikely to garner many headlines. However, these changes lead to concrete improvements, saving travelers time and improving overall quality of life. Furthermore, Big Data-derived improvements can inform city planners when deciding which direction their cities will take in the future. Before launching large and expensive projects, city managers will be able to look at information gleaned from Big Data to determine what the long-term effects will be, potentially changing cities in fundamental ways.
An often-cited reason to use REST is that it’s easy to debug because it’s “human readable”. Not being easy to read is a tooling issue. JSON text is only human readable because there are tools that allow you to read it – otherwise it’s just bytes on a wire. Furthermore, half the time the data being sent around is either compressed or encrypted — neither of which is human readable. Besides, how much of this can a person “debug” by reading? If you have a service that averages a tiny 10 requests per second with a 1 kilobyte JSON body, that is equivalent to about 860 megabytes of data a day, or 250 copies of War and Peace every day. No one can read that, so you’re just wasting money. Then there is the case where you need to send binary data around, or you want to use a binary format instead of JSON. To do that, you must Base64 encode the data. This means that you essentially serialize the data twice — again, not an efficient way to use modern hardware.
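Both costs above are easy to check directly. A minimal Python sketch, using a random 1 KB payload as a stand-in for real service data, shows the daily-volume arithmetic and the size penalty of wrapping binary data in Base64 for JSON:

```python
import base64
import json
import os

# A 1 kilobyte binary payload, standing in for the data a service might send.
raw = os.urandom(1000)

# Embedding binary in JSON requires Base64, inflating the payload by roughly a third.
encoded = base64.b64encode(raw).decode("ascii")
doc = json.dumps({"payload": encoded})

overhead = len(encoded) / len(raw)
print(f"raw: {len(raw)} bytes, base64: {len(encoded)} bytes "
      f"({overhead:.0%} of original size)")

# Daily volume at 10 requests/second with a 1 KB body:
daily_mb = 10 * 1000 * 86400 / 1_000_000
print(f"daily volume: {daily_mb:.0f} MB")  # matches the ~860 MB figure above
```

Base64 maps every 3 input bytes to 4 output characters, so the encoded payload is about 33% larger before JSON quoting and escaping add their own bytes on top.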
Quote for the day:
"Managers maintain an efficient status quo while leaders attack the status quo to create something new." -- Orrin Woodward