Duplex was — and still is — very much a work in progress. Among other things, the system didn’t provide a disclosure in the early days, a fact that could run afoul of the “two-party consent” laws that govern recording phone calls and conversations in states like Connecticut, Florida, Illinois, Maryland, Massachusetts, Montana, New Hampshire, Pennsylvania, Washington and Google’s own home base of California. “The consent-to-record issues here go beyond just Duplex to the broader legal implications of machine speech,” said Gabe Rottman, director of the Technology and Press Freedom Project at the Reporters Committee for Freedom of the Press. “If the service extends to all-party consent states or globally, you could see questions pop up like whether consent is valid if you don’t know the caller is a machine. Curveballs like that are just going to multiply the more we get into the uncanny valley where automated speech can pass as human.” Going forward, the system will be confined to those states where the laws make it feasible; the same applies to interstate calls, so long as both sides are covered.
Daniel Culbertson, an economist at job posting site Indeed.com, says those younger workers are more attracted to technology jobs than older workers are. In addition, when workers under 40 go looking for a job, they tend to click on very different postings than their older counterparts do. For organizations that are looking to expand their head count in a tight labor market, attracting these younger workers can be critical for remaining competitive. That means they need to craft job postings that will appeal to millennials. The skills that attract attention from young job candidates can also serve as a sort of compass for where the technology industry is heading. Because technology changes so quickly, tech workers tend to look for jobs related to areas that they believe will become more important in the future. Their interests can highlight trends that are likely to remain relevant for some time. Culbertson ran an analysis of job seeker behavior on Indeed.com and came up with a list of terms that appeared most often in the job postings clicked by people under 40.
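The core of such an analysis is just counting how often a set of terms appears in the postings a cohort clicked. A back-of-the-envelope sketch of the idea, using invented postings and an invented term list rather than Indeed's real data:

```python
import re
from collections import Counter

# Hypothetical postings clicked by the under-40 cohort (stand-in data).
clicked_postings = [
    "Machine learning engineer building Python data pipelines",
    "Full stack developer, React and Node, cloud deployment",
    "Data scientist: Python, machine learning, model deployment",
]

# Terms to track -- an illustrative list, not Indeed's actual vocabulary.
TERMS = ["machine learning", "python", "react", "cloud", "java"]

counts = Counter()
for posting in clicked_postings:
    text = posting.lower()
    for term in TERMS:
        counts[term] += len(re.findall(re.escape(term), text))

print(counts.most_common(3))
```

A real study would normalize by how often each term appears in postings overall, so that a term clicked disproportionately often stands out rather than just a common one.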
Artificial intelligence and robotics were initially thought to be a danger mainly to blue-collar jobs, but that is changing: white-collar workers – such as lawyers and doctors – who carry out purely quantitative analytical processes are also becoming an endangered species. Some of their methods and procedures are increasingly being replicated and replaced by software. For instance, researchers at MIT's Computer Science and Artificial Intelligence Laboratory, Massachusetts General Hospital and Harvard Medical School developed a machine learning model to better detect cancer. They trained the model on 600 existing high-risk lesions, incorporating parameters like family history, demographics, and past biopsies. When tested on 335 further lesions, the model predicted the status of a lesion with 97 per cent accuracy, enabling the researchers to correctly identify which high-risk lesions should be upgraded to cancer. In traditional practice, mammograms uncover suspicious lesions, the findings are tested with a needle biopsy, and abnormalities go on to surgery – yet roughly 90 per cent of those excised lesions turn out to be benign, rendering the procedures unnecessary.
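The workflow described – train a classifier on labeled historical lesions, then measure accuracy on a held-out set – can be sketched in miniature. Everything below is synthetic: the feature names, the data generator, and the nearest-centroid classifier are illustrative stand-ins, not the MIT team's actual model.

```python
import random

random.seed(0)

# Hypothetical lesion record: three normalized risk features (loosely
# standing in for family history, demographics, past biopsies) plus a
# label: 1 if the lesion was later upgraded to cancer, 0 if benign.
def make_lesion(upgraded):
    base = 0.7 if upgraded else 0.3
    features = [min(1.0, max(0.0, random.gauss(base, 0.15))) for _ in range(3)]
    return features, upgraded

# Mirror the study's split sizes: 600 training lesions, 335 test lesions.
train = [make_lesion(random.random() < 0.5) for _ in range(600)]
test = [make_lesion(random.random() < 0.5) for _ in range(335)]

def centroid(rows):
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(3)]

pos = centroid([f for f, y in train if y])       # mean of upgraded lesions
neg = centroid([f for f, y in train if not y])   # mean of benign lesions

def predict(features):
    dist = lambda c: sum((features[i] - c[i]) ** 2 for i in range(3))
    return 1 if dist(pos) < dist(neg) else 0     # nearer centroid wins

accuracy = sum(predict(f) == y for f, y in test) / len(test)
print(f"held-out accuracy: {accuracy:.2f}")
```

The real model was far richer, but the evaluation discipline is the same: the 335 test lesions are never seen during training, so the reported accuracy reflects generalization, not memorization.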
Amazon Macie automatically discovers and classifies data stored inside Amazon S3 buckets using machine learning technology for natural language processing, and this might very well be the future. It is clear that human error cannot be reduced to zero, so putting near-real-time automated controls in place to contain the risks when such an error inevitably occurs is a good approach. Another option is to enable Amazon's default encryption feature, which will automatically encrypt any file placed inside a bucket. Other available features include Amazon's permission checks and alarms and the use of access control lists. It is also critical to monitor public access and API calls. Alerts should be set and actioned to cover the dumping of large numbers of files or large files in general. A SIEM can assist in correlating the required security event data for these alerts via rules and set thresholds. Data breaches through cloud storage are a problem that will not go away. There are many reasons why this topic is still such an issue, but there are mitigation options and there have been some promising developments in this space.
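The threshold-rule idea behind those SIEM alerts is simple to sketch: scan a window of access events and flag bulk uploads or oversized objects. The event shape, rule names, and thresholds below are illustrative assumptions, not a real SIEM rule format or the AWS API.

```python
from collections import defaultdict

# Illustrative thresholds for one evaluation window.
MAX_PUTS = 100                         # uploads per principal per window
MAX_OBJECT_BYTES = 500 * 1024 * 1024   # 500 MB single-object limit

def evaluate(events):
    """Return alert tuples for bulk uploads and oversized objects."""
    alerts = []
    puts = defaultdict(int)
    for e in events:
        if e["action"] != "PutObject":
            continue
        puts[e["principal"]] += 1
        if e["bytes"] > MAX_OBJECT_BYTES:
            alerts.append(("large-object", e["principal"], e["key"]))
    for principal, count in puts.items():
        if count > MAX_PUTS:
            alerts.append(("bulk-upload", principal, count))
    return alerts

# 150 small uploads from one service account trips the bulk-upload rule.
events = [{"action": "PutObject", "principal": "svc-backup",
           "key": f"dump/part-{i}", "bytes": 1024} for i in range(150)]
print(evaluate(events))
```

In production the window, thresholds, and suppression logic would live in the SIEM itself; the point is that "dumping large amounts of files" reduces to a countable, alertable condition.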
Not only is AI automating jobs we don’t want to do, it’s also opening the doors to jobs we can’t do. Since AI can process a vastly larger dataset than a human can, it can leverage that scale to identify marketing insights that would otherwise be lost. Say you want to take the next step in that content-marketing data-collection project: you not only want to catalogue all of the “video marketing” content, but to catalogue all of the content being published in your industry more broadly. Ultimately, you'll want to use this catalogue to drive market-informed content campaigns of your own. Identifying newly emerging topics or types of articles that garner above-average shares can help direct new content creation to align with existing trends. A given article could have many different qualities that could lead to its success. It’s AI’s ability to tag and compare many data points that ultimately produces the marketing takeaway. AI’s strength in turning a mass of data into insight truly shines in the noisiest, highest-volume channels that a marketer hopes to master.
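The "tag and compare" step can be illustrated with a toy catalogue: each article carries topic tags and a share count, and topics whose average shares beat the overall average are surfaced as candidates for new content. The articles, tags, and numbers are invented for illustration.

```python
from statistics import mean

# Hypothetical catalogue of industry articles with tags and share counts.
articles = [
    {"tags": ["video marketing", "how-to"], "shares": 950},
    {"tags": ["video marketing"], "shares": 800},
    {"tags": ["press release"], "shares": 120},
    {"tags": ["case study", "how-to"], "shares": 610},
    {"tags": ["press release"], "shares": 90},
]

overall = mean(a["shares"] for a in articles)

# Group share counts by tag, then keep tags that outperform the average.
by_tag = {}
for a in articles:
    for tag in a["tags"]:
        by_tag.setdefault(tag, []).append(a["shares"])

trending = sorted(tag for tag, shares in by_tag.items()
                  if mean(shares) > overall)
print(trending)
```

At real scale the tags would come from an ML tagging model rather than by hand, which is exactly where the "dataset larger than a human can process" advantage shows up.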
Best effort is a familiar scenario for most IT shops. A security engineer, executive, or another leader has said, “We need to install some level of security.” This typically involves implementing firewalls, basic security components, and maybe some basic auditing and monitoring. The next rung up the ladder is regulatory compliance. This is often an executive-level initiative: the thought is that business needs compel the company to be compliant with PCI, HIPAA, or some other standard. One might think this would make the security architecture more robust. Unfortunately, while compliance may be necessary for auditing purposes, it does not guarantee security. The third level is essentially the defensive approach — “I’m going to make this network so secure that no one is going to break into it.” This is when all those inline and out-of-band devices are deployed. You can even create defense-in-depth strategies for prevention. For instance, if someone gets through Port 80 on the firewall, the next step is to challenge the data with DPI (deep packet inspection). There are other things you can do as well, like implement prevention, detection, and response processes.
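The defense-in-depth idea – traffic that clears one control is challenged by the next – can be sketched as a pipeline of checks. The layer names and rules below are toy illustrations, not a product configuration.

```python
# Per-source packet counts for the toy rate-limit layer.
_seen_counts = {}

def firewall(pkt):
    return pkt["port"] in (80, 443)           # perimeter: allowed ports only

def deep_packet_inspection(pkt):
    # Naive content check standing in for real DPI signatures.
    return b"<script>" not in pkt["payload"]

def rate_limit(pkt):
    _seen_counts[pkt["src"]] = _seen_counts.get(pkt["src"], 0) + 1
    return _seen_counts[pkt["src"]] <= 3      # per-source packet budget

LAYERS = [firewall, deep_packet_inspection, rate_limit]

def admit(pkt):
    # A packet must survive every layer; any single control can stop it.
    return all(layer(pkt) for layer in LAYERS)

ok = {"src": "10.0.0.5", "port": 80, "payload": b"GET / HTTP/1.1"}
bad = {"src": "10.0.0.6", "port": 80, "payload": b"<script>x</script>"}
print(admit(ok), admit(bad))  # True False
```

The design point is that getting through Port 80 on the firewall buys an attacker nothing on its own: the payload still has to survive the DPI layer, and the source still has to stay under its rate budget.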
Becoming more effective at leveraging data and analytics is forcing organizations to move beyond the world of Business Intelligence (BI) to embrace the world of predictive and prescriptive analytics. Business Intelligence is about descriptive analytics: retrospective analysis that provides a rearview-mirror view on the business—reporting on what happened and what is currently happening. Predictive analytics is forward-looking analysis: providing future-looking insights on the business—predicting what is likely to happen and what one should do ... Unfortunately, at many of the companies where I talk and teach, there is an “analytics chasm” that hinders the transition from descriptive questions to predictive analytics and prescriptive actions. This chasm is preventing organizations from fully exploiting the potential of data and analytics to power the organization’s business and operational models ... Forever in search of the technology “silver bullet” (the newest technology that magically solves the Analytics Chasm challenge), IT organizations continue to buy new technologies without a good understanding of what it takes to cross the Analytics Chasm.
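The descriptive/predictive distinction fits in a few lines: the same monthly sales series answers "what happened" with an aggregate and "what is likely to happen" with a simple least-squares trend line. The numbers are invented, and a real forecast would of course use a proper model, not a straight line.

```python
sales = [100, 104, 109, 115, 118, 124]  # six months of unit sales (invented)

# Descriptive (BI): rearview-mirror reporting on what already happened.
last_month = sales[-1]
average = sum(sales) / len(sales)

# Predictive: fit y = a + b*x by least squares and extrapolate one month.
n = len(sales)
xs = range(n)
x_mean = sum(xs) / n
y_mean = average
b = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, sales)) / \
    sum((x - x_mean) ** 2 for x in xs)
a = y_mean - b * x_mean
forecast = a + b * n

print(f"average so far: {average:.1f}, next-month forecast: {forecast:.1f}")
```

Crossing the chasm is less about the arithmetic than about the organizational shift from asking the first kind of question to acting on the second.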
A startup called Aion is developing what it calls a “token bridge” that will let holders of Ethereum-based tokens back up their assets on another blockchain—initially, one built and run by Aion—without duplicating the actual monetary supply, says Matthew Spoke, the company’s founder. The process relies on a group of computers, also called nodes, that have the ability to recognize valid transactions and write new ones to each chain, Spoke says. The nodes that form the bridge will also have a process for reaching agreement amongst themselves and deciding whether to respond to a certain transaction on one of the chains by executing a corresponding one on the other. Spoke says a big difference between the pre-internet days and the blockchain world is the money: today’s competing protocols are often backed by billions of dollars of investment. That will probably ensure that many will succeed, meaning the future will be ruled by numerous blockchains, he says, and interoperability will be key to mainstream adoption. Whatever we end up with, it probably won’t look like the internet—but it could be just as transformative.
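The bridge mechanism Spoke describes – tokens locked on one chain, a quorum of bridge nodes agreeing, and a corresponding transaction executed on the other chain – can be sketched as a toy. Every structure here is a simplified assumption for illustration, not Aion's actual design.

```python
# Hypothetical bridge: five known nodes, three of which must agree.
BRIDGE_NODES = ["n1", "n2", "n3", "n4", "n5"]
QUORUM = 3

chain_a = {"alice": 50}   # balances on the source chain
chain_b = {}              # balances on the destination chain

def request_transfer(user, amount, votes):
    """Lock tokens on chain A and mint matching tokens on chain B,
    but only if a quorum of recognized bridge nodes signs off."""
    if chain_a.get(user, 0) < amount:
        return False
    valid = [v for v in votes if v in BRIDGE_NODES]
    if len(set(valid)) < QUORUM:
        return False                    # not enough valid signatures
    chain_a[user] -= amount             # lock on chain A...
    chain_b[user] = chain_b.get(user, 0) + amount  # ...mint on chain B
    return True

assert request_transfer("alice", 20, ["n1", "n2", "n3"])
assert not request_transfer("alice", 20, ["n1", "bogus"])  # no quorum
total = sum(chain_a.values()) + sum(chain_b.values())
print(chain_a, chain_b, total)
```

The invariant the design protects is visible in the last line: the combined supply across both chains never changes, so backing up assets on a second chain does not duplicate the monetary supply.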
“High frequencies, in the range of 100GHz to 1THz (terahertz),” will be used for 100Gbps 6G, say the ComSenTer scientists from the University of California, Santa Barbara, in a release. The group created the ComSenTer center, which is part of the Semiconductor Research Corporation (SRC) and based at their school. For spectrum comparison, Verizon’s initial 5G millimeter-wave trials (run with Qualcomm and Novatel Wireless), which are taking place now, will only go as far up the spectrum as 39GHz. “Our center is simply the next-, next-generation of communication and sensing,” says Ali Niknejad, ComSenTer associate director and a UC Berkeley professor, on SRC’s website. It’s “something that may become ‘6G.’” “Extreme densification of communications systems, enabling hundreds and even thousands of simultaneous wireless connections” will be part of it, the researchers claim, “with 10 to 1,000 times higher capacity than the nearer-term 5G systems and network.” Medical imaging, augmented reality and sensing for the Internet of Things (IoT) are some of the applications the scientists say will be enhanced by faster-than-5G radios.
"People are realizing that the data they have has some value, either for internal purposes or selling to a data partner, and that is leading to more awareness of how they can share data anonymously," Mike Flannagan of SAP told InformationWeek in an interview earlier this year. He said that different companies are at different levels of maturity in terms of how they think about their data. Even if you share data that has been anonymized in order to train an algorithm, the question remains whether you are giving away your competitive edge when you share your anonymized data assets. Organizations need to be careful. "Data is extremely valuable," said Ali Ghodsi, co-founder and CEO of Databricks (the big data platform with its origins offering hosted Spark) and an adjunct professor at the University of California, Berkeley. In Ghodsi's experience, organizations don't want to share their data, but they are willing to sell access to it. For instance, organizations might sell limited access to particular data sets for a finite period of time. Data aggregators are companies that will create data sets to sell by scraping the web, Ghodsi said.
Quote for the day:
"A leader must have the courage to act against an expert's advice." -- James Callaghan