The most pronounced difference between the two speakers is in their respective digital assistants: Amazon Alexa and Google Assistant. Note that both are constantly evolving and adding new features and capabilities, so any comparison is based on a snapshot in time. That said, tests of both products by TechHive and other tech publications generally agree: Amazon Alexa excels as a tool for ordering stuff, while Google Assistant wins out when it comes to general search and information requests. Both platforms are pretty good at controlling other smart home devices and systems, although Amazon was more aggressive early on about working with third-party developers. Google, however, has come a long way on that front. So if you envisage your primary use to be adding items to your Amazon shopping cart—“Alexa, reorder coffee”—then Alexa is the way to go. If you want to use it less for shopping and more for information—“Hey Google, how long will it take for me to get to Sacramento by train?”—then Google might have the edge (Alexa responded to that query with driving time). If you’re looking to control the other devices in your home, check to see which platform is most compatible with what you have. More on that in a bit.
Of course, it is difficult for security professionals to separate the wheat from the chaff when it comes to machine learning and AI. Unfortunately, many vendors simply slap “AI” onto their messaging, but scratch beneath the surface and it’s nothing more than words. This makes it harder for organisations to know whether what they are being promised is true, and the resulting cynicism may be one reason so few businesses are investing in these technologies. It’s more important now than ever before that enterprises shift from the manual to the automated world and harness technologies that can carry out some of this heavy lifting. Regulation such as the GDPR, with its stipulated reporting timelines, almost makes this an imperative. The job of a technology team has become much harder with the increasing number of cyber threats and how rapidly they are evolving, so if there are ways to save time on other jobs, teams should surely be grasping them with both hands. Now it’s time for security professionals to pick up the pace.
You check your blood pressure, your tire pressure, and your stock prices. But when was the last time you audited your agile process? Even experienced agilists can fall into bad habits, and it’s important to catch them early. That’s why I recommend auditing your agile process every six months. It might sound daunting, but everything you need boils down to four questions you can fit on a standard 3x5 notecard. During your next retrospective, ask your team to answer each question on a five-star rating scale: five stars means you have a super-awesome process, and one star means you have essentially no process, or a really poor one. Most scores will fall somewhere in between, but they will help your team focus on improving the weakest points. ... Each person on the team is responsible for calling out where stories lack clarity. Poorly constructed stories result in churn and wasted time. Developers, quality engineers, and product owners must agree on definition, business value, requirements, and internal dependencies. Otherwise, you’ll find bugs, pushback, and failure to sign off on a completed story.
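A few lines of Python can sketch how those star ratings might be tallied to find the weakest area. The question labels below are illustrative placeholders, not the article’s actual four questions:

```python
# Tally five-star retrospective ratings and surface the weakest area.
# Question labels here are placeholders, not the article's own four questions.
from statistics import mean

def audit_summary(ratings_by_question):
    """ratings_by_question maps a question label to the list of
    1-5 star ratings given by each team member."""
    averages = {q: mean(stars) for q, stars in ratings_by_question.items()}
    weakest = min(averages, key=averages.get)  # lowest-scoring question
    return averages, weakest

ratings = {
    "story clarity": [2, 3, 2, 4],
    "process health": [4, 5, 4, 4],
}
averages, weakest = audit_summary(ratings)
print(f"Focus next sprint on: {weakest}")
```

Averaging rather than debating each vote keeps the retrospective short; the team only digs into the lowest-scoring question.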
Once you gather the survey data, your company’s NPS is determined by subtracting the percentage of detractors from the percentage of promoters; passives still count toward the total number of respondents. ... It’s easy enough to calculate your organization’s NPS manually, but if you want to outsource the process, there are third-party services that will help you send out surveys and determine your score. ... A good net promoter score is technically anything above zero, which means you have more promoters than detractors. The worst score you can get is -100, which means you don’t have a single promoter and all your customers are detractors; conversely, a score of 100 means every respondent is a promoter. A score of 50 or more is considered excellent. ... The result is a straightforward metric that companies can use to gauge customer loyalty and the health of the company’s brand. It’s just one question, but it’s an important metric for helping businesses understand where they stand in the market and decide whether their effort is better spent maintaining customer satisfaction or trying to win back unhappy customers.
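The arithmetic is simple enough to capture in a few lines of Python. The 0–10 response buckets (9–10 promoters, 7–8 passives, 0–6 detractors) follow the standard NPS convention; the function name is our own:

```python
def net_promoter_score(responses):
    """Compute NPS from 0-10 'how likely are you to recommend us?' answers.

    Promoters score 9-10, detractors 0-6; passives (7-8) are not
    counted directly but still inflate the respondent total.
    """
    total = len(responses)
    promoters = sum(1 for r in responses if r >= 9)
    detractors = sum(1 for r in responses if r <= 6)
    return round(100 * (promoters - detractors) / total)

# Example: 5 promoters, 3 passives, 2 detractors out of 10 responses
print(net_promoter_score([10, 9, 9, 10, 9, 8, 7, 7, 5, 2]))  # → 30
```

Note how the three passives pull the score down simply by being in the denominator: with only the seven promoters and detractors counted, the same survey would score about 43.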
With the free Teams product, Microsoft is telling its largest rivals -- Cisco and Slack -- that the company is in the market "to win it -- or at least significantly disrupt it," Kurtzman said. However, the competitors have advantages. Slack has more than 1,500 third-party app integrations, and Cisco's Webex Teams is a video-centric collaboration platform that works well with Cisco's networking hardware and software. Microsoft is preparing for battle by simplifying its collaboration portfolio. The company has said it will replace Skype for Business Online with Teams, a move that raised concerns that Teams won't have the same telephony tools. Microsoft has tried to ease customer anxiety by rolling out Teams calling features, such as call delegation and direct routing. Call delegation lets a user receive someone else's calls -- a necessary feature within enterprises. Direct routing enables companies to use their existing telephony infrastructure with Teams. However, accessing that function requires a company to have Teams and Phone System -- formerly called Cloud PBX -- as part of an Office 365 subscription.
SHA-2 is the cryptographic hashing standard that all software and hardware should be using now, at least for the next few years. SHA-2 is often called the SHA-2 family of hashes because it contains hashes of several different sizes, including 224-, 256-, 384-, and 512-bit digests. When someone says they are using the SHA-2 hash, you don’t know which bit length they mean, but the most popular by a large margin is 256 bits. Although SHA-2 shares some of the same mathematical characteristics as SHA-1 and minor weaknesses have been discovered, in crypto-speak it’s still considered “strong” for the foreseeable future. Without question, it’s far better than SHA-1, and any critical certificates, applications, and hardware devices still using SHA-1 should be moved to SHA-2. All major web browser vendors (e.g., Microsoft, Google, Mozilla, Apple) and other relying parties have for years requested that all customers, services, and products currently using SHA-1 move to SHA-2, although what has to be moved by when differs by vendor.
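The family’s different digest sizes are easy to see with Python’s standard hashlib module; the message hashed here is arbitrary:

```python
import hashlib

message = b"move from SHA-1 to SHA-2"

# Each SHA-2 family member produces a different fixed-size digest.
# A hex digest has 4 bits per character, so length * 4 gives the bit size.
for name in ("sha224", "sha256", "sha384", "sha512"):
    digest = hashlib.new(name, message).hexdigest()
    print(f"{name}: {len(digest) * 4}-bit digest")
```

SHA-256’s popularity is why a certificate advertised as “SHA-2” almost always means the 256-bit variant in practice.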
Photons, as used in the quantum-key distribution work, will likely end up securing future networks and could turn out to be a crucial element of quantum computing overall. The particles of light are good for moving qubits (quantum information carriers) because they can travel long distances and work with fabricated chips, explained the University of Maryland in a news article announcing what it called a breakthrough in photon-carried quantum computing. The school said it has invented the first single-photon transistor on a semiconductor chip. Traditional transistors are the minuscule routing switches used in every form of computing. Producing a photon-based one, where the switches interact with each other, could “attain exponential speedup for certain computational problems”; photons don’t natively interact, which had been a downside. “Roughly 1 million of these new transistors could fit inside a single grain of salt. It is also fast and able to process 10 billion photonic qubits every second,” the school said. “Quantum communications technologies are starting to play a significant role in securing our data and communications,” said Dr. Grégoire Ribordy.
Using the in-memory database and application platform SAP HANA, SAP has developed a prototype that helps organizations analyze geospatial data and predict how storms can impact a given region. After years of collaboration with Esri, a leader in geographical information systems, the two companies announced tighter integration between SAP HANA and Esri’s “geodatabase” in January. This allows customers to analyze geographic information within their business processes and take action more easily. Previously, customers had to analyze location data separately from business applications, then combine the two. As Hasso Plattner, co-founder of SAP and chairman of the Supervisory Board of SAP SE, pointed out at SAPPHIRE NOW, SAP has taken spatial capabilities one step further and released them as services that can pull weather or satellite data directly from providers into the enterprise data layer. Customers can now create location-aware applications more quickly using this functionality, part of the recently announced SAP HANA Data Management Suite.
The EU’s General Data Protection Regulation, like its predecessor the Data Protection Directive, authorizes the export of EU citizens’ personal information only to jurisdictions that provide an adequate level of privacy protection. Privacy Shield, an agreement signed by EU and U.S. officials in 2016, seeks to reconcile the different levels of legal protection afforded on each side of the Atlantic, allowing businesses to export EU citizens’ data to the U.S. for processing. The EU’s executive body, the European Commission, ruled in 2016 that the Privacy Shield deal provided adequate protection for personal information, but called for it to be reviewed annually. It’s with an eye on the next review of the agreement, in September, that Members of the European Parliament called for the deal to be suspended in a vote on July 5. The Parliament’s resolution on Privacy Shield identified several areas in which U.S. authorities had not yet met their commitments under the agreement, despite having been given a deadline of May 25, 2018. The U.S. Senate has still not ratified the appointment of three members of the Privacy and Civil Liberties Oversight Board (PCLOB), including its chairman.
Many cloud backup and storage solutions have appeal because they offer cloud storage plus data access and restore from anywhere. However, such solutions don’t offer capabilities that allow users to fully recover applications, servers, and entire business operations in a tight timeframe. Because of this, companies require IT resilience that is affordable and effective, which means solutions must offer automated, seamless access to your data and applications. But what, specifically, do solutions need in order to achieve resilience? A few key elements of technology must be present for IT Resilience and Assurance (ITRA): anomaly detection, backup, deduplicated-file-system-assisted replication, orchestration, and assurance. That’s a lot to pack into one sentence, so let’s break them down! ... Anomaly detection is a feature that enables users to predictively detect a risk to their systems. This capability gives users an early warning when activity involving their data could be related to a ransomware attack or another kind of malware. One sign that a ransomware attack is occurring is affected files being renamed, causing them to appear as new files when backed up.
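As an illustration of that last signal only, here is a minimal, hypothetical sketch (not any vendor’s actual detection logic) that flags a backup snapshot in which a large share of previously seen file names have vanished and unfamiliar names have appeared in their place, the typical signature of a mass encrypt-and-rename:

```python
def mass_rename_alert(previous_names, current_names, threshold=0.5):
    """Return True if consecutive backup snapshots look like a mass rename.

    previous_names / current_names: file names in consecutive snapshots.
    threshold: fraction of old names that must disappear (while new
    names appear) to raise the alert.
    Purely illustrative; real products combine many more signals.
    """
    prev, curr = set(previous_names), set(current_names)
    if not prev:
        return False
    vanished = prev - curr   # old names gone
    appeared = curr - prev   # unfamiliar names in their place
    churn = min(len(vanished), len(appeared)) / len(prev)
    return churn >= threshold

# Every document renamed with a ransom extension triggers the alert.
snapshot_a = ["q1.xlsx", "notes.docx", "plan.pdf"]
snapshot_b = ["q1.xlsx.locked", "notes.docx.locked", "plan.pdf.locked"]
print(mass_rename_alert(snapshot_a, snapshot_b))  # → True
```

Taking the minimum of vanished and appeared names keeps ordinary deletions or bulk imports, where only one side of the churn is large, from triggering a false alarm.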
Quote for the day:
"No one really succeeds everyday but successful people do something everyday to help themselves succeed." -- @LeadToday