“Using artificial intelligence or machine learning can help with the information/data overload problem. Instead of presenting security analysts with terabytes of raw data, we can present them with easy-to-understand views such as behavioural profiles, virtual "video recordings" of user sessions, or a prioritised view of all unusual events. A machine can really efficiently dig through tons of raw data and produce real insight from it, thereby freeing up security teams to focus on what's really important for them.” This fast, accurate processing of data also gives defenders another weapon against attackers: the ability to find behavioural patterns. That leads to the second major issue facing security professionals: attackers are constantly evolving and staying one step ahead of defenders.
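The "prioritised view of all unusual events" described above can be sketched with a simple statistical baseline. This is a minimal illustration, not any vendor's method: the user names, event counts, and z-score threshold are all invented for the example.

```python
from statistics import mean, stdev

# Hypothetical per-user daily event counts: a short baseline history
# plus today's observed activity (all numbers are illustrative).
baseline = {
    "alice": [12, 15, 11, 14, 13],
    "bob":   [40, 38, 42, 41, 39],
    "carol": [5, 6, 4, 5, 7],
}
today = {"alice": 14, "bob": 95, "carol": 6}

def anomaly_score(history, observed):
    """Z-score of today's count against the user's own baseline."""
    mu, sigma = mean(history), stdev(history)
    return (observed - mu) / sigma if sigma else 0.0

# Prioritised view: the most unusual behaviour floats to the top,
# so analysts review a ranked list instead of raw event data.
ranked = sorted(
    ((user, anomaly_score(baseline[user], today[user])) for user in today),
    key=lambda pair: pair[1],
    reverse=True,
)
for user, score in ranked:
    print(f"{user}: {score:+.1f}")
```

In this toy data, bob's sudden jump from roughly 40 events to 95 dominates the ranking, while alice and carol stay near their own baselines; real systems use far richer features, but the ranking idea is the same.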
We are in the midst of a reorganization of our economy in which the platform owners are seemingly developing power that may be even more formidable than was that of the factory owners in the early industrial revolution. We prefer the term “platform economy,” or “digital platform economy,” a more neutral term that encompasses a growing number of digitally enabled activities in business, politics, and social interaction. If the industrial revolution was organized around the factory, today’s changes are organized around these digital platforms, loosely defined.
Fowler says some GE employees choose to use collaboration platforms that GE owns and has certified, such as Yammer in Microsoft's Office 365 suite. Others gravitate to apps like Slack. GE's employees have access to federated apps such as Yammer and Skype for Business, but they are also free to use other collaboration tools if they adhere to what Fowler calls "guardrails," including support for single sign-on, and audit and data-sharing controls. "If somebody finds that there's another tool that works better and we can license it in a legal way, and we can run it in a secure fashion, and they don't put certain types of data in it, I'm also not going to get in the way of it," Fowler says.
The data footprint and storage I/O requirements of IoT and big data differ from those of the traditional data center application. First, IoT data is typically a continuous feed. Data sizes can vary from minuscule to enormous, and the number of files to store can reach into the trillions. Together, these characteristics make it easy to create large amounts of data very quickly, so there is constant demand for capacity growth. That growth must scale quickly and in ways that aren't disruptive. Storage systems for an IoT project also need to scale cost-effectively so that an organization can store petabytes of data for a long time. That requires low administrative cost and burden; most IT staff simply cannot manage a dozen storage systems from six different vendors.
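A quick back-of-the-envelope calculation shows how a continuous feed turns into constant capacity growth. Every figure below (sensor count, report rate, payload size) is an assumption chosen for illustration, not a number from the article.

```python
# Hypothetical IoT deployment: all input figures are assumptions.
SENSORS = 100_000
READINGS_PER_SEC = 1           # each sensor reports once per second
BYTES_PER_READING = 200        # small telemetry payload
SECONDS_PER_DAY = 86_400

daily_bytes = SENSORS * READINGS_PER_SEC * BYTES_PER_READING * SECONDS_PER_DAY
yearly_tb = daily_bytes * 365 / 1e12

print(f"Daily ingest: {daily_bytes / 1e9:.1f} GB")
print(f"Yearly growth: {yearly_tb:.0f} TB")
```

Even this modest scenario produces well over a terabyte of new data per day and hundreds of terabytes per year, which is why non-disruptive, cost-effective scaling matters more here than for a traditional application.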
The hacker tools’ release “demonstrates the key risk of the U.S. government stockpiling computer vulnerabilities for its own use: Someone else might get a hold of them and use them against us,” said Kevin Bankston, director of New America’s Open Technology Institute. “This is exactly why it should be U.S. government policy to disclose to software vendors the vulnerabilities it buys or discovers as soon as possible, so we can all better protect our own cybersecurity.” The weekend’s release prompted immediate speculation about who might be behind it. A group calling itself Shadow Brokers claimed responsibility. Some experts and former employees suspect, although without hard evidence, that Russia is involved.
Static defences do not work against a yet-unknown attack; instead, our systems need to adapt to new types of attack. Keep in mind, too, that a proportion of incoming requests are still bona fide attempts to use the service. This makes it harder to inspect the traffic and to work out a classification scheme for traffic filtering: since not all incoming requests can be assumed to be part of the attack, deriving appropriate filtering rules is more complex. If the filters chosen are too specific, they do not block the attack; if they are too general, they may block legitimate traffic. However, as defenders of good, we seek to solve these problems through the application of analytical techniques to detect DDoS attacks. A widely diverse range of statistical methods and machine learning techniques could be used to detect abnormal changes in resource usage that are indicative of a DDoS attack.
Computer “assistants” like Siri and Cortana are the most visible use of NLP today, but there are many other applications of NLP in use. As mentioned above, Google has poured a great deal of resources into NLP as it relates to search, allowing us to type or speak a natural question and receive a relevant answer. Google is also using NLP to create predictive text responses to emails in its Inbox email client, allowing users to choose from one of three responses and reply to an email with a single click. You may have used NLP yourself if you have ever used the “translate” link inside Facebook to translate a foreign language into your own (with varying results) or used Google Translate on Google or Bing search results. Reliable machine translation has been a goal of NLP since the 1950s, and results are improving all the time.
"The biggest myth is you have to have clean data to do analysis," said Arijit Sengupta, CEO of BeyondCore. "Nobody has clean data. This whole crazy idea that I have to clean it to analyze doesn't work. What you do is, you do a 'good enough' analysis. You take your data, despite all the dirtiness, and you analyze it. This shows where you have data quality problems. I can show you some patterns that are perfectly fine despite the data quality problems. Now, you can do focused data quality work to just improve the data to get a slightly better insight." Megan Beauchemin, director of business intelligence and analytics for InOutsource, agreed. "Oftentimes, organizations will put these efforts on the back burner, because their data is not clean. This is not necessary. Deploying an analytic application will illuminate, visually, areas of weakness in data," she said.
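The "good enough" analysis Sengupta describes can be sketched in a few lines: analyze the rows that parse, and let the rows that don't reveal where the data-quality work is needed. The dataset, field names, and parsing rule below are invented for illustration.

```python
from statistics import mean

# Hypothetical dirty dataset: missing and malformed values left in place.
records = [
    {"region": "east", "revenue": "1200"},
    {"region": "east", "revenue": None},
    {"region": "west", "revenue": "950"},
    {"region": "west", "revenue": "n/a"},
    {"region": "east", "revenue": "1100"},
]

def parse(value):
    """Best-effort numeric parse; None signals a data-quality problem."""
    try:
        return float(value)
    except (TypeError, ValueError):
        return None

# 'Good enough' analysis: use what parses, and report what didn't.
parsed = [parse(r["revenue"]) for r in records]
usable = [v for v in parsed if v is not None]
dirty_rate = 1 - len(usable) / len(records)

print(f"Mean revenue (usable rows): {mean(usable):.0f}")
print(f"Rows needing data-quality work: {dirty_rate:.0%}")
```

The analysis still yields a usable estimate from the clean rows, while the dirty-rate figure points the cleanup effort at exactly the records that failed, which is the focused data-quality work both quotes recommend.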
How much the programmers on your team in particular affect success or failure is hard to quantify, but clearly software, and those who make it, play a critical role in grabbing the market before the competition. Coding for a startup is different from coding for an established company. The startup culture is unique and extends to every corner of the business, from finance to sales to operations to software development. Your offering must be simple and inexpensive. You must be laser-focused on your customer and change your offering quickly and constantly based on customer experience. No silos, no sacred cows. Not just any code will do, and not just any coder will do. The coder, whether one of the founders or not, must be married first to the customer, not to the code. In particular, the software mindset must:
It is typical in Vietnamese culture for people to want to stay in their country, work in IT locally, and provide for their families. This is a significant difference and an important advantage for the Vietnamese outsourcing environment. Then there is the level of technical talent. Malaysia has technical competency, but does not seem to offer the same scalability as Vietnam; I often hear of organizations struggling to build out teams fast enough in Malaysia because of the quantity of staff an assignment requires. I believe that technical competency in Vietnam is superior to the Philippines. However, English proficiency is better in the Philippines, which is why the Philippines is so strong in call centers.
Quote for the day:
"Treat people as if they were what they ought to be, and you help them become what they are capable of being." -- Johann Wolfgang von Goethe