September 01, 2016

Technical Practices as a Hack on Consciousness: Why to Hack Yourself

The emergence of integral consciousness is a result of rapidly increased access to many different perspectives and many people of different backgrounds coming together in postmodern society. As a result, individuals can begin to realize the vast range of choice in who they are and the burden of self-authorship. In other words, they realize that their mental models, both of who they are and of how the world works, are not as immutable as they appear. The principle of “mutual causality,” as described by Buddhist scholar and environmental activist Joanna Macy, means the doer and the deed are co-arising and create each other. We are created by our actions. We co-arise through our work and our interactions with others. When teams form and perform well, they have likely taken advantage of mutual causality.


3 new programming languages: What their creators say

It’s one matter to create a language; it’s another to ensure it’s of use to others who may be bumping up against similar problems. Each of the developers of these three emerging languages sees use cases already evolving. Oden, Wickström says, is good at tasks within Go’s wheelhouse: web servers, back-end services, and command-line tools, to name a few. It is also good for building libraries that provide generic user-defined data structures, generic algorithms, and control-flow abstractions, he says. Crystal, meanwhile, is suited to web services because it has nonblocking I/O and lightweight processes, Borenszweig says. Existing applications include command-line applications, emulators, websites, and IRC bots. “It can also be used in competition programs, where you need to prototype fast but achieve good performance,” Borenszweig says. “Finally, it can be used to build compilers, such as Crystal.”


10 Essential Resources for Evaluating Backup and Disaster Recovery Solutions

When it comes to selecting a solution for a new backup and recovery initiative, where should you start? A typical internet search may yield some worthwhile answers, though you’ll likely be bombarded with sponsored links to so-called ‘expert’ analysts pushing their own agendas. Consulting trustworthy sources of information is the name of the game. At Solutions Review, we put ourselves in the middle of it all, searching for resources that help buyers of enterprise technology select the tools that best fit their needs. With this in mind, we’ve compiled a list of the 10 best resources solutions-seekers should consult during the research phase of a new backup and recovery project.


SQL engines boost Hadoop query processing for big data users

Deploying a SQL-on-Hadoop query engine might require more processing horsepower in a Hadoop cluster, as well. For example, Premier added five compute nodes to its cluster when it moved the ETL processing for physician performance data to the Hadoop system, increasing the total number of nodes to 19. And Palmer said he expects the cluster to grow further as the company puts more applications on it. But he added that the SQL-driven expansion of the cluster, which currently holds about 65 TB of data, has saved money overall by enabling Premier to completely replace one data warehouse appliance system and reduce its use of another one -- both being higher-cost processing platforms than Hadoop. The new Hadoop math is simple, according to Palmer: "I like to say that the more we expand the cluster, the more money we save."
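
As a hedged illustration of what this kind of workload looks like from the client side, here is a minimal Python sketch that sends SQL to a HiveServer2-compatible SQL-on-Hadoop endpoint using the third-party pyhive package; the host, port, table, and column names are hypothetical stand-ins, not details from Premier's deployment.

# A minimal sketch: pushing an aggregation down to a Hadoop cluster
# through a HiveServer2-compatible SQL-on-Hadoop engine via pyhive.
# All connection details and schema names below are hypothetical.
from pyhive import hive

conn = hive.connect(host="hadoop-edge.example.com", port=10000)
cursor = conn.cursor()

# An ETL-style aggregation expressed as ordinary SQL; the engine
# plans and executes it across the cluster's nodes.
cursor.execute(
    "SELECT physician_id, AVG(score) AS avg_score "
    "FROM performance_events "
    "GROUP BY physician_id"
)

for physician_id, avg_score in cursor.fetchall():
    print(physician_id, avg_score)

cursor.close()
conn.close()

The point of the sketch is the division of labor: the client simply submits SQL, while the engine distributes the processing across the cluster's nodes, which is what makes adding commodity Hadoop nodes cheaper than expanding a warehouse appliance.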


Notebook Workflows: The Easiest Way to Implement Apache Spark Pipelines

Databricks Notebook Workflows are a set of APIs to chain together Notebooks and run them in the Job Scheduler. Users create their workflows directly inside notebooks, using the control structures of the source programming language (Python, Scala, or R). For example, you can use if statements to check the status of a workflow step, use loops to repeat work, or even make decisions based on the value returned by a step. This approach is much simpler than external workflow tools such as Apache Airflow, Oozie, Pinball, or Luigi because users can transition from exploration to production in the same environment instead of operating another system. Notebook Workflows are supervised by the Databricks Jobs Scheduler, which means every workflow gets the production functionality provided by Jobs, such as fault recovery and timeout mechanisms.
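
As a sketch of the pattern, the Python fragment below chains two notebooks with dbutils.notebook.run and branches on the value the first one returns via dbutils.notebook.exit; the notebook paths, timeouts, and arguments are hypothetical, and dbutils is only available inside a Databricks notebook, so this will not run standalone.

# A minimal sketch of a notebook workflow; dbutils is supplied by the
# Databricks notebook environment. The paths, timeouts (in seconds),
# and arguments below are hypothetical.

# Run the first step and capture the string the child notebook
# returns through dbutils.notebook.exit("OK" / "FAILED").
status = dbutils.notebook.run("./ingest-data", 3600, {"date": "2016-09-01"})

# Branch on the returned value, retrying the step once on failure.
if status != "OK":
    status = dbutils.notebook.run("./ingest-data", 3600, {"date": "2016-09-01"})

# Proceed to the next stage only if ingestion eventually succeeded.
if status == "OK":
    dbutils.notebook.run("./train-model", 7200)

Because each step communicates through its exit value, the same control structures extend naturally to loops over dates or parameter sets.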


Keezel's wireless device protects hotel Wi-Fi, home IoT connections

The latest use is as a security layer for connected home appliances. Smart lightbulbs often contain dumb security flaws, and many of them are never patched. Rather than leave your entire home network at risk of some hacker finding your lightbulb and using it as a staging post for an attack, you could use Keezel to isolate it from other devices on your home network, co-founder Friso Schmid said. Future software updates to Keezel could also block ads or prevent access to malicious websites, Muller and Schmid suggested. Keezel recently upgraded its design with a new Wi-Fi chip that adds coverage of the 5 GHz band, so it now supports 802.11b, g, and n. There's no point in adding 802.11ac, Muller said, because the dual-core ARM chip at the heart of the device wouldn't be able to keep up.


Why Agile Is Critical for Attracting Millennial Engineers

A Millennial-generation developer who works on JIRA Software at Atlassian told me, "I chose to work in an Agile environment because our industry moves fast, and I don't want to sink time and energy into something that doesn't matter, or won't help me move forward." Agile development, he said, "goes beyond the technologies we use or the processes we follow. It's an attitude, and it needs to be present throughout the entire organization." For some Millennials, working with less-than-pure Agile methods is a compromise. A 27-year-old project manager told me he left a Big 5 consulting firm after nearly five years because the architecture-driven version of Agile practiced there was "too constrained by governance." He made a lateral move to a job with a software firm focused on small, private customers.


Creating the next wave of data center innovations

The biggest opportunity for the data center industry lies in the effective use of the vast quantities of data unleashed by IoT, combined with big data analytics as a catalyst for the next wave of innovation. As an industry, we must not only be able to instrument a data center and provide data to data center operators; we must also be able to consolidate millions of data points every minute from multiple data centers and use sophisticated data mining tools to spot trends. ... If the industry can come together to achieve that, we will be able to perform true benchmarking with universal metrics across the entire industry. With that, we are looking at the possibility of true machine learning – not just machine learning on one data center in one location, but machine learning throughout the entire data center ecosystem – where not only one company benefits but the ecosystem as a whole benefits.


Encryption hiding malware in half of cyber attacks

The trend is expected to grow in parallel with the greater legitimate use of encryption. Inbound encrypted traffic is expected to rise from 39% to 45% next year, and outbound encrypted traffic from 33% to 41%. When asked about malware hiding outbound data within encrypted traffic, 74% of respondents said this was highly likely, but only 16% thought their organisation could identify and mitigate an SSL-encrypted malware attack before data exfiltration. When asked whether traffic from an SSL-secured malware server could be spotted by their intrusion prevention system (IPS), 79% of respondents said it was highly likely such traffic could occur in their organisation, but only 17% thought their organisation had the ability to mitigate such an attack.


Detection & Response: Where To Begin

As with any technology, though, machine learning is just one of the many layers in the overall security infrastructure. "It’s an additional layer on top of a SIEM that augments and helps to tune the system," Poulin said. CISOs struggle to determine which of those layers are most important, and at what point there are so many layers that the technology becomes redundant or inharmonious. Poulin said, "Perspective is everything. My personal philosophy is borrowed from a woodworking expression, 'measure twice, cut once'. You need to have something to measure the information." All they have to do to weed through the overgrowth is determine where the problem lies for them: at the perimeter, in user role management, or in data access? The problem for many who feel so overwhelmed and understaffed is that looming question: Where do I begin?



Quote for the day:


"Program testing can be used to show the presence of bugs, but never to show their absence!" -- Edsger W. Dijkstra

