According to Kaspersky Lab, the TeamXRat attackers perform brute-force attacks against internet-connected RDP servers and then manually install the Xpan ransomware on the hacked servers. "Connecting remote desktop servers directly to the Internet is not recommended and brute forcing them is nothing new; but without the proper controls in place to prevent or at least detect and respond to compromised machines, brute force RDP attacks are still relevant and something that cybercriminals enjoy," the Kaspersky researchers said in a blog post. "Once the server is compromised, the attacker manually disables the Antivirus product installed on the server and proceeds with the infection itself."
A British company has just completed a pilot in Indonesia that uses blockchain to pioneer a new method of traceability for fishing, one that could stop many of the troubling practices within the illegal fishing industry, including slavery. Provenance used mobile technology, blockchain and smart tagging to track fish caught by fishermen with verified social sustainability claims. The pilot successfully tracked fish in Indonesia from January to June of 2016, and demonstrated not just another digital interface, but a way to track products and claims securely without the need for a centralized data management system.
It is predicted that by 2021, the convergence of these smart technologies will produce a huge impact on the wealth profession, unlocking the doors of global wealth across a diverse universe of investors. However, with a fast-paced marketplace, it is important for investors to understand their customers’ needs and behaviors, and make the necessary technology changes to meet their requirements. Bob Reynolds, President and CEO of Putnam Investments, commented in the report that “the business moves in cycles, and some are severe.” ... As a consequence, economist Dr. Nouriel Roubini said in the report that “mediocre growth and low interest rates have become the new normal.”
Developed by The Open Group Governing Board Business Architecture Work Group, this is the first installment of a three-part standard. Combined, the three parts of the standard will explicitly address all aspects of a business architecture practice. Not only will it examine the holistic approach to modeling required, but also the way of working and thinking, as well as organizing and supporting. The standard clearly defines the systemic nature of transformations and the varying interests and goals of stakeholders, and prepares for consistent communication of business priorities and needs throughout the transformation lifecycle. It addresses a real need to solve structural challenges in enterprise and organizational transformations.
As the tech world moves toward the cloud, it’s hard to imagine it functioning without IoT. And as we indulge ourselves in our devices and pour large amounts of data into this enormous mesh called IoT, it has swelled to gigantic proportions. Such a huge system demands an extensive amount of technology and skill to sustain itself. But do we have what it takes to monitor, maintain and secure IoT? According to Nick Jones, VP Distinguished Analyst at Gartner, “A recurring theme in the IoT space is the immaturity of technologies and services and of the vendors providing them. Architecting for this immaturity and managing the risk it creates will be a key challenge for organizations exploiting the IoT. In many technology areas, lack of skills will also pose significant challenges.”
With an SDDC, there are APIs for everything, so I can enable our software deployment for our product. They can have an API through Puppet and deploy through the infrastructure, and we can set up the key metrics, so if we're seeing load increase on our conservative platform, we can automatically expand that, or I can move that up to AWS. I've got some drivers from the leadership team: [They said] 'We want to move to AWS, we want to be faster.' Okay. I would argue that an SDDC makes you incredibly fast when you look at what we need to do as a company and how we need to service the dev and product teams -- it's that API-driven economy. They just want to be able to fire code out and know that that code gets deployed and we're operating and monitoring it and we're ensuring that stuff is staying up.
In the PM interview, Denham was also pressed on whether the ICO is doing anything to stop data flowing now, while it probes the arrangement, but she said she believes no data is yet flowing from UK WhatsApp users to Facebook. “We are told that data is not yet being shared — so I am hoping that there is a pause in the data-sharing, and some rethinking of the terms and the consent and what data is being shared,” she said. We’ve asked Facebook to confirm whether or not it is harvesting UK WhatsApp data at this point and will update this post with any response. Making a general statement about the data-sharing agreement earlier this month, Europe’s Article 29 Working Party ... asserted that: “Users should keep control of their data when Internet giants massively compile it.”
Automation is having the biggest impact on areas in which employees manage physical devices, such as network services. Most IT towers see an average 25 percent decrease in the number of resources required as a result of automation, but certain IT services experience a 50 percent headcount reduction, according to ISG. ISG found that network and voice costs are declining by 66 percent, mostly due to the convergence of voice, video and data solutions built on highly standardized and virtualized capabilities, an environment ripe for leveraging automation. Service desk and end-user support costs declined by 26 percent due to increased adoption of self-help and remote support, the introduction of self-healing functionality, and significant automation of level-one and level-two incidents.
Machine learning is a buzzword in the technology world right now, and for good reason: It represents a major step forward in how computers can learn. Very basically, a machine learning algorithm is given a “teaching set” of data, then asked to use that data to answer a question. For example, you might provide a computer a teaching set of photographs, some of which say, “this is a cat” and some of which say, “this is not a cat.” Then you could show the computer a series of new photos and it would begin to identify which photos were of cats. Machine learning then continues to add to its teaching set. Every photo that it identifies — correctly or incorrectly — gets added to the teaching set, and the program effectively gets “smarter” and better at completing its task over time.
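The loop described above — classify against a labeled “teaching set,” then fold each answer back into the set — can be sketched with a minimal nearest-neighbour classifier. This is purely illustrative Python under assumed conditions: the feature vectors and labels are invented, and a real system would derive features from the photos themselves.

```python
import math

# Hypothetical "teaching set": (feature vector, label) pairs.
# In a real system the features would come from image processing.
teaching_set = [
    ((1.0, 1.0), "cat"),
    ((1.2, 0.9), "cat"),
    ((5.0, 5.0), "not a cat"),
    ((5.5, 4.8), "not a cat"),
]

def classify(features):
    """Label a new example by its nearest neighbour in the teaching set."""
    nearest = min(teaching_set,
                  key=lambda item: math.dist(features, item[0]))
    return nearest[1]

def classify_and_learn(features):
    """Classify, then append the result so the teaching set keeps growing."""
    label = classify(features)
    teaching_set.append((features, label))
    return label

print(classify_and_learn((1.1, 1.1)))  # near the "cat" cluster -> "cat"
print(len(teaching_set))               # the teaching set grew to 5
```

Note the caveat baked into `classify_and_learn`: incorrect answers are added back too, exactly as the excerpt says, so the program gets “smarter” only insofar as its answers are mostly right.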
A common misconception is the foreach loop in C# operates on IEnumerable. That is almost correct, but it actually operates on anything that looks like an IEnumerable. That means it must have a GetEnumerator method and that method must return an object (or struct) with Current and MoveNext methods, the latter of which returns a Boolean. This was necessary back in the .NET 1.x era when we didn’t have generics or IEnumerable<T>. If you used a non-generic IEnumerable to loop over an integer array, it would have to allocate a new object for each item in the array (an operation known as boxing). As that would be ridiculously expensive, they decided C# would look for a custom enumerator first, and if it couldn’t find one then it would fall back on IEnumerable.GetEnumerator.
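C#’s pattern-based lookup has a close analogue in Python’s iterator protocol, which is also duck-typed: `for` accepts any object whose `__iter__` method returns something with a `__next__` method, with no interface inheritance required. A minimal sketch of the same idea in Python (the `Countdown` class is invented for illustration; exhaustion is signalled by `StopIteration` rather than `MoveNext` returning false):

```python
class Countdown:
    """Duck-typed iterable: no base class, just the right method names."""
    def __init__(self, start):
        self.start = start

    def __iter__(self):
        # Analogous to C#'s GetEnumerator(): hand the loop an object
        # carrying the methods it looks for by name.
        return CountdownIterator(self.start)

class CountdownIterator:
    def __init__(self, current):
        self.current = current

    def __next__(self):
        # Analogous to MoveNext()/Current: yield the next value,
        # or raise StopIteration when the sequence is exhausted.
        if self.current <= 0:
            raise StopIteration
        value = self.current
        self.current -= 1
        return value

    def __iter__(self):
        return self

print(list(Countdown(3)))  # [3, 2, 1]
```

As in C#, nothing here implements a library interface; the loop machinery simply finds the methods it expects by shape, which is what makes allocation-free custom enumerators possible.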
Quote for the day:
"The Crystal Wind is the Storm, the Storm is Data, and the Data is Life." (The Players Litany) -- Daniel Keys Moran