Design succeeds when it finds ideal solutions based on the real needs of real people. In a recent Harvard Business Review article on the evolution of design thinking, Jon Kolko noted, "People need their interactions with technologies and other complex systems to be simple, intuitive, and pleasurable. Design is empathic, and thus implicitly drives a more thoughtful, human approach to business." When done well, human-centered design enhances the user experience at every touchpoint and fuels the creation of products and services that deeply resonate with customers. Human-centered design is foundational to the success of companies like SAP, Warby Parker, and Airbnb. ... To delight employees, Cisco has identified "moments that matter" -- such as joining the organization, changing jobs, and managing family emergencies -- and redesigned its employee services around these moments.
In addition, the majority of software developers and system administrators are not accustomed to working in an environment containing federally regulated information such as ePHI, Copolitco wrote. Security controls may chafe developers, who have to adjust how they work. “All companies who have a compliance obligation must remember that the point of HIPAA compliance is to impose a certain level of security,” said Reed. “Security is the ultimate goal, not necessarily compliance. Compliance comes as a result of having a good security program. Being compliant does not mean you are secure; it merely means you have 'checked the boxes.'” An HHS Office for Civil Rights official stated at the recent HIMSS and Healthcare IT News Privacy & Security Forum in Boston that the organization will be conducting on-site audits of hospitals in 2017 and that OCR is currently engaged in more than 200 audits.
No matter what I have done throughout my career, "data" has always played a very important part. Over the years, I've seen how data has transformed both the business and engineering sides of development. Machine learning is no longer limited to large enterprises; smaller companies are ready to get involved and take advantage of its benefits. And with the proven results from deep neural networks in various fields, it is clear that machine learning and deep neural networks will play a very important role in technology going forward. I suppose my interest in data science is very well timed for the rise of machine learning. Technology changes everything time and time again, and for every programmer, self-transformation is an important step to stay relevant and competent in an ever-changing field.
I have found the ‘N+1 Syndrome’ to be the most common reason, especially among accomplished professionals who are doing well in their current gig. The thinking goes like this: you are earning well, you have a good name at work, your family is comfortable, and most importantly, you get that nice paycheck at the end of the month! Yes, you have this exciting idea; the thought of not reporting to a stupid boss is enticing; the lure of hitting that IPO jackpot, becoming famous, and retiring by the time you are 40 is tantalizing! You are going to do it, yes! No one is better qualified! You will just get this one little thing out of the way, and then you are set! Most even have excellent ideas for the new business, but somehow they keep pushing the start date back by a year, then another year, and another.
For context, CPUs, or central processing units, are the processors that have been at the heart of most computers since the 1960s. But they are not well suited to the incredibly high computational requirements of modern machine learning approaches, in particular deep learning. In the late 2000s, researchers discovered that graphics cards were better suited to the highly parallel nature of these tasks, and GPUs, or graphics processing units, became the de facto technology for implementing neural networks. But as Google’s use of machine learning continued to expand, the company wanted something custom built for its needs. “The need for TPUs really emerged about six years ago, when we started using computationally expensive deep learning models in more and more places throughout our products.”
To understand why APIs inherently lack security, you must understand that API exploits attempt to compromise the application in one of two ways. The first is through application programming errors, exploited to reveal data or impair the operation of the application. These exploits manifest themselves through malicious inputs such as SQL injection, cross-site scripting, and other attempts at exposing data. Generally, applications can be secured against programming errors. This is often an iterative process that can take months to years of use, testing, patching, and retesting, but it can be done. The second avenue is through attempts to exploit the business logic of the application to create unauthorized access or fraudulent transactions. Business-logic exploits are the harder of the two to identify and stop. Applications are increasingly designed as microservices, which expose a large number of interfaces to the Internet.
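To make the first category concrete, here is a minimal sketch (not from the article) of the same database lookup written two ways: one vulnerable to SQL injection, one defended with a parameterized query. The table name and helper names are illustrative assumptions.

```python
import sqlite3

def unsafe_lookup(conn, username):
    # Vulnerable: attacker-controlled input is spliced into the SQL string,
    # so an input like "' OR '1'='1" rewrites the query's logic.
    query = f"SELECT id FROM users WHERE name = '{username}'"
    return conn.execute(query).fetchall()

def safe_lookup(conn, username):
    # Parameterized: the driver treats the input strictly as data,
    # never as SQL syntax.
    return conn.execute(
        "SELECT id FROM users WHERE name = ?", (username,)
    ).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'alice')")

payload = "' OR '1'='1"
print(len(unsafe_lookup(conn, payload)))  # injection matches every row: 1
print(len(safe_lookup(conn, payload)))    # no such literal username: 0
```

This is the sense in which programming errors can be patched iteratively: each such input-handling bug has a known, mechanical fix, whereas a business-logic flaw looks like a perfectly well-formed request.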
Even if the backup looks promising, there is no easy button. The people creating ransomware know that backups can stand between them and their payday. There are many cases where Microsoft Volume Shadow Copies have been destroyed by ransomware. If you leave your backups online for quick recovery, you may find that ransomware can actually delete or corrupt them. This is not uncommon; read the user groups of various backup companies and you’ll see the sad tales of woe. If that is not concerning enough, there are other potential dangers to your backups. They need to be air-gapped from systems your users have access to. Before you bring your backups online, make sure the affected computers are off the network. You need to be absolutely certain that those systems can’t access the backup.
TMDs are compounds composed of a transition metal such as molybdenum or tungsten and a chalcogen (typically sulfur, selenium or tellurium, although oxygen is also a chalcogen). Like graphene, they form into layers. But unlike graphene, which conducts electricity like a metal, they are semiconductors, which is great news for flexible chip designers. Stefan Wachter, Dmitry Polyushkin and Thomas Mueller of the Institute of Photonics, working with Ole Bethge of the Institute of Solid State Electronics in Vienna, decided to use molybdenum disulfide to build their microprocessor. They deposited two molecule-thick layers of it on a silicon substrate, etched with their circuit design and separated by a layer of aluminium oxide. "The substrate fulfills no other function than acting as a carrier medium and could thus be replaced by glass or any other material, including flexible substrates," they wrote.
DevOps implementations also vary from company to company. At business law firm Benesch, Friedlander, Coplan & Aronoff LLP, "I think the real focus is on agile communication and client outcomes, versus delivery," said CIO Jerry Justice. "[It's about] creating a solid feedback loop so you can adjust targets and timings." However, not all companies are ready to fully jump on board the new workflow. While Simon Johns, IT director at Sheppard Robson Architects LLP, said the firm has yet to implement DevOps, he also said that "there are elements of the 'philosophy' I would like to introduce into our workflows—build fast, fail fast type of situations." David Wilson, director of IT services at VectorCSP, said he doesn't plan to implement the workflow. "After nearly 30 years of IT experience, I doubt any of those large software companies are really investing in this," Wilson said.
While some network ports make good entry points for attackers, others make good escape routes. TCP/UDP port 53 for DNS offers an exit strategy. Once criminal hackers inside the network have their prize, all they need to do to get it out the door is use readily available software that turns data into DNS traffic. “DNS is rarely monitored and even more rarely filtered,” says Norby. Once the attackers safely escort the data beyond the enterprise, they simply send it through their own DNS server, designed to translate it back into its original form. The more commonly used a port is, the easier it is to sneak attacks in with all the other packets. TCP port 80 for HTTP carries the web traffic that browsers receive. According to Norby, attacks on web clients that travel over port 80 include SQL injections, cross-site request forgeries, cross-site scripting, and buffer overruns.
Quote for the day:
Leadership: "If you are not building for the long term you are doing the wrong thing." --@Bill_George