Daily Tech Digest - April 07, 2019

Can you teach humor to an AI?


“Artificial intelligence will never get jokes like humans do,” states Kiki Hempelmann, a computational linguist who studies humor at Texas A&M University-Commerce. “In themselves, they have no need for humor. They miss completely context,” he adds. Tristan Miller, a computer scientist and linguist at Darmstadt University of Technology in Germany, elaborates on how difficult context is for machines to process: “Creative language — and humor in particular — is one of the hardest areas for computational intelligence to grasp.” Miller has analyzed more than 10,000 word plays and found the task quite challenging. “It’s because it relies so much on real-world knowledge — background knowledge and commonsense knowledge. A computer doesn’t have these real-world experiences to draw on. It only knows what you tell it and what it draws from,” he concludes.



Security flaws in banking apps expose data and source code


Exposed source code, sensitive data, access to backend services via APIs and more have been uncovered after a researcher downloaded various financial apps from the Google Play store and found that it took, on average, just eight and a half minutes before she was reading the code. Vulnerabilities including a lack of binary protections, insecure data storage, unintended data leakage, weak encryption and more were found in banking, credit card and mobile payments apps and are detailed in a report by cybersecurity company Arxan: In Plain Sight: The Vulnerability Epidemic in Financial Mobile Apps. "There's clearly a systemic issue here – it's not just one company, it's 30 companies and it's across multiple financial services verticals," Alissa Knight, cybersecurity analyst at global research and advisory firm Aite Group and the researcher behind the study, told ZDNet. The vast majority – 97 percent of the apps tested – were found to lack binary code protections, making it possible to reverse engineer or decompile the apps, exposing source code to analysis and tampering.


Why blockchain (might be) coming to an IoT implementation near you

Blockchain technology can be counter-intuitive to understand at a basic level, but it’s probably best thought of as a sort of distributed ledger keeping track of various transactions. Every “block” on the chain contains transactional records or other data to be secured against tampering, and is linked to the previous one by a cryptographic hash, which means that any tampering with the block will invalidate that connection. The nodes - which can be largely anything with a CPU in it - communicate via a decentralized, peer-to-peer network to share data and ensure the validity of the data in the chain. The system works because all the blocks have to agree with each other on the specifics of the data that they’re safeguarding, according to Nir Kshetri, a professor of management at the University of North Carolina. If someone attempts to alter a previous transaction on a given node, the rest of the data on the network pushes back. “The old record of the data is still there,” said Kshetri. That’s a powerful security technique - absent a bad actor successfully controlling all of the nodes on a given blockchain, the data protected by that blockchain can’t be falsified or otherwise fiddled with.
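The hash-linking described above can be sketched in a few lines of Python. This is a toy illustration of the principle, not a production blockchain; the transaction strings and helper names are invented for the example:

```python
import hashlib
import json

def block_hash(block):
    # Hash the block's contents deterministically.
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain, data):
    # Each new block records the previous block's hash,
    # cryptographically linking it to the chain.
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"data": data, "prev_hash": prev})

def is_valid(chain):
    # Tampering with any block breaks the hash link that
    # the following block recorded.
    for i in range(1, len(chain)):
        if chain[i]["prev_hash"] != block_hash(chain[i - 1]):
            return False
    return True

chain = []
add_block(chain, "alice pays bob 5")
add_block(chain, "bob pays carol 2")
print(is_valid(chain))                   # True
chain[0]["data"] = "alice pays bob 500"  # tamper with an old record
print(is_valid(chain))                   # False
```

Changing the old record invalidates every later link, which is exactly the "push back" Kshetri describes: the rest of the network still holds the original hashes.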


Researchers developed algorithms that mimic the human brain (and the results don’t suck)

Krotov and Hopfield’s work maintains the simplicity of the old school studies, but represents a novel step forward in brain-emulating neural networks. TNW spoke with Krotov who told us: If we talk about real neurobiology, there are many important details of how it works: complicated biophysical mechanisms of neurotransmitter dynamics at synaptic junctions, existence of more than one type of cells, details of spiking activities of those cells, etc. In our work, we ignore most of these details. Instead, we adopt one principle that is known to exist in the biological neural networks: the idea of locality. Neurons interact with each other only in pairs. In other words, our model is not an implementation of real biology, and in fact it is very far from the real biology, but rather it is a mathematical abstraction of biology to a single mathematical concept – locality. Modern deep learning methods often rely on a training technique called backpropagation, something that simply wouldn’t work in the human brain because it relies on non-local data.
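The locality principle Krotov describes can be made concrete with a small sketch. This is not Krotov and Hopfield's actual algorithm, only a generic Hebbian-style update included to show what "local" means: each weight changes based solely on the activity of the two neurons it connects, with no global error signal propagated backward through the network.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden = 8, 4
W = rng.normal(scale=0.1, size=(n_hidden, n_in))

def hebbian_step(W, x, lr=0.01):
    # Local rule: the update to W[i, j] uses only the activity of
    # presynaptic neuron j (x[j]) and postsynaptic neuron i (y[i]).
    y = np.tanh(W @ x)
    W = W + lr * np.outer(y, x)
    # Row normalization keeps the weights bounded.
    return W / np.linalg.norm(W, axis=1, keepdims=True)

for _ in range(100):
    x = rng.normal(size=n_in)
    W = hebbian_step(W, x)

print(W.shape)  # (4, 8)
```

Backpropagation, by contrast, adjusts each weight using gradients computed from the error at the output layer, information that is not available at the synapse itself, which is why it is considered biologically implausible.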


Self-Service Delivery


Self-Service Delivery is an approach that makes the tools necessary to develop and deliver applications available via self-service. It makes the actions we need to take as developers — starting, developing and shipping software — available as user-accessible tools, so that we can work at our own speed without getting blocked. By making actions automated and accessible, it's easier to standardize configurations and practices across teams. We need specific building blocks to enable Self-Service Delivery. The same principle at the heart of your favorite framework applies to delivery. If we think of delivery phases in framework terms, each phase has a default implementation, which can be overridden. For example, if the convention is that Node projects in my team are built by running npm test, then I include a test script in my project. I don't write the code that runs the script, nor tell my build tool explicitly to do so. The same is true for other phases of delivery.
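As a concrete illustration of that convention, a Node project on such a team would simply declare its test script in package.json and the delivery tooling would discover and run it. This is a minimal hypothetical fragment; the project name and test runner are invented:

```json
{
  "name": "my-service",
  "scripts": {
    "test": "jest"
  }
}
```

Running `npm test` now works for every project that follows the convention, without each team wiring its build tool by hand.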


Artificial intelligence can now emulate human behaviors – soon it will be dangerously good

At the moment, there are enough potential errors in these technologies to give people a chance of detecting digital fabrications. Google's Bach composer made some mistakes an expert could detect. For example, when I tried it, the program allowed me to enter parallel fifths, a music interval that Bach studiously avoided. The app also broke musical rules of counterpoint by harmonizing melodies in the wrong key. Similarly, OpenAI's text-generating program occasionally wrote phrases like "fires happening under water" that made no sense in their contexts. As developers work on their creations, these mistakes will become rarer. Effectively, AI technologies will evolve and learn. The improved performance has the potential to bring many social benefits – including better health care, as AI programs help democratize the practice of medicine. Giving researchers and companies freedom to explore, in order to seek these positive achievements from AI systems, means opening up the risk of developing more advanced ways to create deception and other social problems. Severely limiting AI research could curb that progress.


The Race For Data And The Cybersecurity Challenges This Creates

High-tech today needs to be doing the exact same thing, with an emphasis on cybersecurity problems. Rather than sending devices and apps into the connected ecosystem willy-nilly, we need to fully understand what could happen when we do. How many people could be impacted? How many companies? What are the financial losses that could be sustained? What about losses to brand/image? In other words: do we really understand the implications of what we are creating here? These questions, if well researched, should be enough to slow down time-to-market and eventually stop breaking so many things. This should be performed both at the development stage in every company and the adoption stage. Companies creating products have a responsibility to their customers to ensure safety and they can’t do that if they don’t fully take everything into account. On the other end of the spectrum, CIOs, CTOs, and anyone responsible for buying and adopting new tech in your business needs to perform the same sort of analysis. Don’t just buy tech for tech’s sake.


Serverless computing growth softens, at least for now

Plans or intentions for serverless implementations have slipped as well, the Cloud Foundry survey also shows. Currently, 36 percent report evaluating serverless, compared to 42 percent in the previous survey.  Some of this may be attributable to the statistical aberrations that occur within surveys that are conducted within months of one another -- don't be surprised if the numbers pop again in the fall survey. Diving deeper into the adoption and planned adoption numbers, the survey's authors point out that within organizations embracing serverless architecture, usage is actually proliferating. For users and evaluators, 18 percent say they are broadly deploying serverless across their entire company, double the percentage (9 percent) who said that only one year ago.  Still, it is telling that there is some degree of caution being exercised when moving to serverless architecture. What's behind the caution?


Vulnerability Management: 3 Questions That Will Help Prioritize Patching


There is usually a significant delta between intended network segmentation and access rights, and what actually exists. Credentials and connections that introduce risk get set up in a variety of ways. We call this actual connectivity the “access footprint.” Throughout the normal work day, users connect and disconnect from various systems and applications, leaving behind cached credentials and potential “live” connections. The access footprint changes constantly. Some risky conditions are fleeting; others can persist for a very long time. But even if these conditions are short-lived, an attacker situated in the right place at the right time (“right” for them, wrong for you!) has plenty to work with. A new report published by CrowdStrike underscores the importance of proactively hardening the network against lateral movement. It’s a vitally important complement to traditional vulnerability management.


The Difference Between Microservices and Web Services


Microservices architecture involves breaking down a software application into its smaller components, rather than just having one large software application. Typically, this involves splitting up a software application into smaller, distinct business capabilities. These can then talk to each other via an interface. ... So, if microservices are like mini-applications that can talk to each other, then what are web services? Well, they are also mini-applications that can talk to each other, but over a network, in a defined format. They allow one piece of software to get input from another piece of software, or provide output, over a network. This is performed via a defined interface and language, such as XML. If you’re running on a network where your software components or services won’t be co-located, or you want the option of running them in separate locations in the future then you will likely need to use web services in some form.
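The "defined interface and language, such as XML" point can be sketched with two hypothetical services exchanging XML messages, using only the Python standard library. The service names, element names and stock rule are invented for the example; in practice the request would travel over a network rather than a function call:

```python
import xml.etree.ElementTree as ET

def build_request(item, qty):
    # Hypothetical "order" service: serializes a request into the
    # agreed XML format that the remote service expects.
    root = ET.Element("order")
    ET.SubElement(root, "item").text = item
    ET.SubElement(root, "qty").text = str(qty)
    return ET.tostring(root, encoding="unicode")

def handle_request(xml_text):
    # Hypothetical "inventory" service: parses the XML request and
    # replies in the same defined format.
    req = ET.fromstring(xml_text)
    qty = int(req.findtext("qty"))
    resp = ET.Element("response")
    ET.SubElement(resp, "status").text = "ok" if qty <= 10 else "backorder"
    return ET.tostring(resp, encoding="unicode")

reply = handle_request(build_request("widget", 3))
print(ET.fromstring(reply).findtext("status"))  # ok
```

Because both sides only depend on the XML contract, the two services need not share a language, a runtime or a location.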



Quote for the day:


"No amount of learning can cure stupidity and formal education positively fortifies it." -- Stephen Vizinczey

