Daily Tech Digest - September 25, 2019

Digital twins – rise of the digital twin in Industrial IoT and Industry 4.0

The rise of the digital twin in the Internet of Things
Digital twins offer numerous benefits, on which we’ll elaborate later. In fact, you might already have seen the concept in action; if you haven’t, the video below, using a bike equipped with sensors, gives you a good idea. In real life, however, you’ll notice that digital twins today are used predominantly in the Industrial Internet of Things, and certainly in engineering and manufacturing. If you remember our airplane engine, or other complex and technology-intensive physical assets such as IoT-enabled industrial robots, you can imagine why. You can even create a digital twin of an entire environment with a set of physical assets, as long as you have the data. ... In the future we’ll see twins expand to more applications, use cases and industries, and get combined with more technologies, such as speech capabilities, augmented reality for an immersive experience, AI capabilities, and technologies that enable us to look inside the digital twin, removing the need to go and check the ‘real’ thing.
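To make the idea concrete: at its core, a digital twin is a virtual object kept in sync with telemetry streamed from its physical counterpart. A minimal Python sketch of the pattern (the class and field names here are illustrative, not from any particular IoT platform):

```python
from dataclasses import dataclass, field

@dataclass
class DigitalTwin:
    """Virtual replica that mirrors the last-known state of a physical asset."""
    asset_id: str
    state: dict = field(default_factory=dict)
    history: list = field(default_factory=list)

    def ingest(self, reading: dict) -> None:
        """Update the twin from a sensor reading (e.g. delivered over MQTT)."""
        self.state.update(reading)
        self.history.append(dict(reading))

    def anomalies(self, limits: dict) -> dict:
        """Flag readings outside expected bounds, without visiting the asset."""
        return {k: v for k, v in self.state.items()
                if k in limits and not (limits[k][0] <= v <= limits[k][1])}

# The sensor-equipped bike from the video, reduced to two readings
bike = DigitalTwin("bike-42")
bike.ingest({"wheel_rpm": 220, "brake_temp_c": 95})
print(bike.anomalies({"brake_temp_c": (0, 80)}))  # {'brake_temp_c': 95}
```

A production twin would persist its history to time-series storage and ingest data over a real telemetry protocol; the point is that once readings land in the twin, you can inspect the asset's state remotely.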



AI will be the biggest disruptor in our lifetime: Amitabh Kant, CEO, NITI Aayog

India is among the very few countries globally where the government has driven digitization in a big way. For instance, almost 99.3 percent of Indians pay their Income Tax online. Almost 96 percent of these filings are cleared within three months because they are digital. The new Goods & Services Tax (GST) is digital – cashless and paperless. The Ayushman Bharat scheme is portable, paperless and digital. It provides health insurance to 500 million Indians. The number of beneficiaries is greater than the population of the USA, Europe, and Mexico put together. Every single rupee released through the Public Finance Management System (PFMS) is tracked digitally to the last point. By integrating technology into various aspects of the economy, the government has generated vast volumes of data. It is important that we use this data, along with computing power and new algorithms, to drive huge disruption. That’s the only way we can radically leapfrog and catch up with advanced economies.


European enterprises 'waste' £24,000 a day on unused cloud services, says Insight research


Given the foundational role that cloud is increasingly playing within enterprise digital transformation strategies, these are important areas to address and get right, the report continues, as organisations set about making better use of their data through the deployment of analytics, machine learning and artificial intelligence tools. Indeed, 46% of respondents flagged AI, big data, machine learning and deep learning tools as being “critical” to their digital transformation initiatives over the past two years. “When analysed, shared, and leveraged intelligently, [data] can facilitate more informed decision-making, improve the quality of offerings, and enhance the customer experience,” the report said. “IT professionals express confidence in AI, big data and machine learning because these technologies enable organisations to transform data into business intelligence.” But this confidence could prove to be misplaced unless organisations have a robust cloud strategy in place to underpin their plans, the report added.


There is a real demand for AI in healthcare, but preserving privacy is key

The introduction of AI into healthcare is important for several reasons. The main one, though? Scale. “With the NHS potentially losing up to 350,000 staff by 2030, using AI will be the only way to scale services to match the mounting demand that is hitting the UK with a shrinking workforce,” explains Lorica. The impact of AI in healthcare won’t only be felt in the NHS. Instead, the technology will have a wide range of applications in everything from personal medicine to research, diagnosis and logistics. But, despite a clear desire to integrate AI, it must be done correctly, and before it can effectively disrupt the sector, Lorica suggests that “various organisational and cultural changes need to be implemented”. ... Before AI can truly transform the healthcare sector, the elephant in the room needs to be addressed: patient confidentiality and privacy. “The NHS holds personal information about almost every person living in the country, which means preserving privacy, collecting and cleaning data and data sharing is paramount,” explains Lorica.


The Interesting Case of Who’s Using the IT4IT™ Standard – Part Two


Digitalization is driving the proliferation of cloud and mobility, and is causing IT organizations to rethink their IT operating model to support both the digital workforce and new service delivery models. To exploit the rapid pace of disruptive IT innovation, HCL Technologies chose to adopt The Open Group IT4IT Reference Architecture to design and develop its XaaS-based (Everything as a Service) product and service offering. This meant that HCL Global IT needed to better understand the business requirements of IT, to allow it to achieve the agility and velocity the business and end users required of its services. To achieve this, HCL Global IT required a unified and sophisticated IT operating model to support the business in its digital transformation journey. HCL therefore aligned its products and services to the IT4IT value stream-based reference architecture and developed a product and platform named XaaS Service Management (XSM), which has the capability to address customer-specific issues and challenges.


BigTech is coming. Is banking ready?

Many banks are responding to the competition from BigTechs (and fintechs) by learning from and co-creating with them to strengthen their client propositions. They are also investing heavily to support new partnerships, acquisitions, and the development of in-house solutions. Supporting this, a recent Bloomberg report that ranked banks by technology spending so far in 2019 showed the top five had invested a combined USD44 billion. Corporates have also been responding to the ‘uberisation’ of commerce following BigTech’s move from online consumer models further into the B2B arena. As well as re-engineering their physical and financial supply chains, corporates are now also rethinking their relationships with transaction banks. For example, as manufacturers try to replicate BigTech’s speed, they are considering decentralised production. Having multiple yet smaller assembly locations puts companies closer to the end-consumer. It also creates a more conducive environment in which to react to changing local demand and offer more customisation.


How Artificial Intelligence and IoT are transforming real estate

The trend for workplace flexibility also provides an incentive for rental platforms and Space as a Service. The increase in smaller companies and self-employed persons is resulting in increased demand for flexible and on-demand workplaces, and many corporates are trying to cut overheads by opting for shared workspaces. As part of this we have developed a software product called Yardi Kube that centres can use to manage members’ space allocations. Yardi Kube folds in a technology management system for shared workspaces, providing the IP addresses, Wi-Fi and telephones that are crucial for this sector. Yardi’s coworking module will be released in 2020 across the Middle East. The platform will provide the most comprehensive coworking software on the market, as it combines financial, workspace and technology management in a centralised database. IoT is a technology in which systems such as plumbing, electrical outlets, thermostats and lighting are connected and perform smart functions via the internet. From convenient property showings and increased energy efficiency to predictive maintenance, IoT applications are making it easier for people to buy, sell and own rental properties. Smart homes with IoT capabilities usually have a higher market value than those without.


Can Oracle substantiate its cloud bluster?


Commenting on the competition in the cloud, Ellison said the cloud databases were open source-based and a lot more specialised. But he added: “None of them are autonomous. None of them are secure. None of them give you 99.995% availability. I mean, they’re – we’re 100 times more reliable.” But Oracle does face a challenge from these open source competitors. The competition is not coming directly from its previous enterprise customers and CIOs in these organisations. Instead, it is being driven from the bottom up by software developers choosing products they consider more exciting and, arguably, technically superior for the applications that use them, compared to Oracle. A recent Stack Overflow survey of 6,000 developers in the UK and Ireland reported that Oracle was not among the top three database servers being used: MySQL was used by 44.9% of respondents, Microsoft SQL Server by 40.7% and PostgreSQL by 31.4%.


Testing Microservice: Examining the Tradeoffs of Twelve Techniques - Part 2

Most projects need a combination of testing techniques, including test doubles, to reach sufficient test coverage and stable test suites — you can read more about this so-called testing pyramid. You are faster to market with test doubles in place because you test less than you otherwise would have. Because you do not need much additional infrastructure or test-doubles knowledge, this technique doesn’t cost much to start with. Costs can grow with complexity, however — for example, as you require more test infrastructure to host groups of related microservices that you must test together. Testing against a test instance of a dependency reduces the chance of introducing issues in test doubles. Follow the test pyramid to produce a sound development and testing strategy, or you risk ending up with big E2E test suites that are costly to maintain and slow to run. Use this technique with caution, only after careful consideration of the test pyramid, and do not fall into the trap of the inverted testing pyramid.
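For illustration, here is the cheapest rung of that trade-off in practice: a unit test that replaces a downstream microservice with a test double, using Python's standard-library unittest.mock (the checkout and inventory services are hypothetical names, not from the article):

```python
from unittest.mock import Mock

def checkout(cart, inventory_service):
    """Business logic under test: depends on a downstream microservice."""
    for item in cart:
        if not inventory_service.in_stock(item):
            return "backordered"
    return "confirmed"

# Test double: stands in for the real inventory microservice, so the test
# runs fast and needs no extra infrastructure -- at the risk of drifting
# from the real service's behaviour, as discussed above.
inventory = Mock()
inventory.in_stock.return_value = True
assert checkout(["book"], inventory) == "confirmed"

inventory.in_stock.return_value = False
assert checkout(["book"], inventory) == "backordered"
```

The same test run against a deployed test instance of the inventory service would catch contract drift, but at the cost of the infrastructure and runtime the article warns about.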


Google Wins 'Right to Be Forgotten' Case

Google commented on the ruling in a statement: "Since 2014, we've worked hard to implement the right to be forgotten in Europe, and to strike a sensible balance between people's rights of access to information and privacy. It's good to see that the court agreed with our arguments." Europe's General Data Protection Regulation, which went into full effect last year, has a separate "right to be forgotten" provision with much broader requirements. ... While ruling that Google does not have to extend the right to be forgotten for European citizens outside of Europe, the court acknowledged that in today's globalized world, information can harm a person's reputation. But it said different countries have different approaches to the right to be forgotten, and hence a universal law cannot be applied. "A global de-referencing would meet the objective of protection referred to in EU law in full. ... Numerous third states do not recognize the right to de-referencing or have a different approach to that right," the court said.




Quote for the day:


"Great leaders go forward without stopping, remain firm without tiring and remain enthusiastic while growing" -- Reed Markham


Daily Tech Digest - September 24, 2019

Two AMD Epyc processors crush four Intel Xeons in tests

Tests by the evaluation and testing site ServeTheHome found that a server with two AMD Epyc processors can outperform a four-socket Intel system that costs considerably more. If you don’t read ServeTheHome, you should. It’s cut from the same cloth as Tom’s Hardware Guide and AnandTech but with a focus on server hardware, mostly the low end, though they throw in some enterprise gear as well. ServeTheHome ran tests comparing the AMD Epyc 7742, which has 64 cores and 128 threads, and the Intel Xeon Platinum 8180M, with its 28 cores and 56 threads. The dollars, though, show a real difference. Each Epyc 7742 costs $6,950, while each Xeon Platinum 8180M goes for $13,011. So two Epyc 7742 processors cost you $13,900, while four Xeon Platinum 8180M processors cost $52,044, nearly four times as much as the AMD chips. And that’s just the chips. The actual servers will also set you back a pretty penny, especially since four-socket servers cost much more than two-socket servers, regardless of the processor you use.
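The price gap is straightforward arithmetic on the list prices quoted above:

```python
epyc_price, xeon_price = 6_950, 13_011    # per-chip list prices cited in the article

amd_total = 2 * epyc_price                # two-socket Epyc 7742 system
intel_total = 4 * xeon_price              # four-socket Xeon Platinum 8180M system

print(amd_total)                          # 13900
print(intel_total)                        # 52044
print(round(intel_total / amd_total, 2))  # 3.74 -- nearly 4x the chip cost
```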



Build cloud economics expertise in-house


Organizations think the cloud is more expensive than an on-premises data center because they don't capture costs effectively throughout the entire process. That starts during the architecture and design phases, where expenditures often go unnoticed. Put tools in place to communicate, at a high level, all the features and services that power your applications and environments. For example, this could take the form of a reference architecture that lays out all the services you have in production. Enterprises are better positioned to spot potential money pits when there's cost transparency, and when managers, IT staff and back-office staff understand their roles in keeping costs in check. As you put together plans for your cloud initiative, remember to factor in the costs of such training. Organizations deploy a variety of cloud storage techniques, such as archival, reduced redundancy and backup, each of which carries its own impact on performance and cost.


Putting blockchain technology to good use


Richard Hunt, founder of Turnkey Consulting, believes that once an individual has been through the process to prove their identity, this proof can be reused in other situations where ID is required. “A digital identity would enable citizens to take back control of their data and their identity, choosing who to share this information with and, perhaps more importantly, who not to,” he says. “It would also allow individuals to both fully understand and capitalise on the value of their personal data.” Gartner distinguished vice-president David Furlonger says governments are looking at ways blockchain can be deployed to improve efficiency. Efficiency-based initiatives are founded on the idea that decentralised, multiparty transactions can be streamlined and settled using blockchain. Government interest is mostly driven by the need to decrease friction in disconnected processes, interactions or transactions between a variety of government organisations, or involving the broader public/private ecosystems.


Microsoft delivers emergency security update for antiquated IE

IE was demoted to second-class-citizen status with the introduction of Windows 10, but Microsoft has been adamant that it will continue to support the browser. IE, particularly IE11, remains necessary in many enterprises and organizations for running aged web apps and internal websites. The browser may retreat to a "mode" within a vastly reworked Microsoft Edge, with the stand-alone version abandoned, but IE will live on in some form. Still, it's no longer the most popular kid on the block: according to the latest data from web analytics vendor Net Applications, IE accounted for just 9% of all Windows-based browsing activity. For comparison, Edge's share of all Windows browsing was around 7%. According to information in the description of the update package, the emergency IE fix is available only through the Microsoft Update Catalog. Users must steer a browser to that website, then download and install the update. The easiest way to locate the IE update is by using the link in the OS-appropriate KB (knowledge base) article gleaned from the security bulletin. (No one said Microsoft makes it easy.)


Managers Lack Confidence To Develop Skills Employees Need Today: Gartner

“Today’s organizations are undergoing a digital transformation that directly impacts how they do business, and they are finding a significant skills gap within their workforces,” said Jaime Roca, senior vice president in the Gartner HR practice. “Our research found that 70% of employees have not mastered the skills they need for their jobs today, let alone the skills needed for their future roles.” Organizations that are most successful at developing their employees have focused on cultivating Connector managers, who are able to connect employees to the right people and resources at the right time. In fact, Connector managers boost employee performance by up to 26% and more than triple the likelihood that their employees will be high performers. “Connector managers give targeted coaching and feedback in their areas of expertise, but they recognize that there are skills best taught by people other than themselves,” said Sari Wilde, managing vice president in the Gartner HR practice.


Contemporary Front-end Architectures


Web applications have evolved from simple static websites (two-tiered architecture) into complex multi-layered SPA- and SSR-driven, API-first systems, and CMS systems have grown into headless, content-first systems. The front-end community has changed rapidly in recent times. It started with the DOM-infused algorithms introduced by jQuery, which were quickly succeeded by the MVC-based Backbone.js. And, in no time, we found ourselves in the jungle of bidirectional and unidirectional data flow architectures. Somewhere, we lost track of how we got here. How did a world so drenched in MVC suddenly get into React-pioneered unidirectional data flow? What is the correlation? As we progress, we will attempt to unlock this puzzle. Though aimed at front-end engineers, the article should help any web developer seeking a general understanding of modern web application architecture. Software architecture underpins a large number of activities — process, requirement gathering, deployment topology, technology stack, etc. However, that is outside the scope of this article.
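The unidirectional data flow the article refers to (popularised by React and Flux/Redux) is easy to sketch language-agnostically: state changes only by dispatching actions through a single reducer, so data always flows one way. A toy Python sketch of the pattern, not any framework's actual API:

```python
def reducer(state, action):
    """Pure function: (old state, action) -> new state. Never mutates in place."""
    if action["type"] == "ADD_TODO":
        return {**state, "todos": state["todos"] + [action["text"]]}
    return state

class Store:
    """Single source of truth; views subscribe and re-render on every change."""
    def __init__(self, reducer, initial):
        self._reducer, self.state, self._listeners = reducer, initial, []

    def subscribe(self, fn):
        self._listeners.append(fn)

    def dispatch(self, action):  # the ONLY path by which state changes
        self.state = self._reducer(self.state, action)
        for fn in self._listeners:
            fn(self.state)

store = Store(reducer, {"todos": []})
store.subscribe(lambda s: print("render:", s["todos"]))
store.dispatch({"type": "ADD_TODO", "text": "read article"})  # render: ['read article']
```

Contrast this with bidirectional (two-way binding) MVC, where views can write back into models directly, making it harder to trace where a change originated.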


Top 5 nontechnical skills for cloud computing success


Legal implications in cloud computing aren't just about compliance and regulations like HIPAA or GDPR. Rules on how to handle data are important, but there's also value in hiring employees who can provide general legal advice around cloud computing. Questions can arise outside the purview of compliance, such as what the tax regulations are around cloud providers. For instance, in some situations it does not make sense to displace existing hardware with cloud services: if that hardware hasn't fully depreciated, the associated tax benefit has not been fully realized. This can lead to a net loss that's much higher than any savings that might come from switching to the cloud. Other legal issues include whether software licenses are transferable, and adherence to service-level agreements. Most people who advise on cloud law are lawyers, although some are converted project managers.


Why CIOs should take extra precautions when buying IT support


“It is important to carefully evaluate primary support capabilities, such as the breadth and depth of the support team in each global region, the comprehensiveness of the service offering, and experience and scope in delivering vital tax, legal and regulatory updates, as well as strategic capabilities like modernisation and cloud services, hybrid IT, business-driven roadmap planning and application management services,” she said. “The right partner will help you to maximise the value of your existing applications and create the capacity to fund your modernisation, as well as free up resources to focus on transforming your IT systems,” Phelan added. “As such, another consideration for enterprises evaluating third-party support is how providers are investing in their proposition. The expertise and talent they bring into the organisation, and the innovative new services they offer, are key to helping companies transform.”


Six Degrees of Application Architecture

With application architecture, there is often a high-level approach or cycle that flows through the process of design and early development/prototyping. As time passes, we get better by leveraging frameworks and patterns that help reduce boilerplate or duplicated effort in the design process. Think about it: since ORM frameworks were introduced, few developers spend time writing code that provides what Hibernate (as an example) provides. Even improvements to languages like Java cut down on the repetitive code which used to be required to process a list of objects. ... It is rare for work to be put in place to update those existing applications to take advantage of the new service. Even if the existing applications are not REST-based, adding the functionality to make a call over HTTP is typically not that involved — especially if the value gained by the legacy application is significant. Most of the time, the reason I encounter for such tasks not getting completed is tied to the budgeted costs of the updates, validation/testing, and deployment.
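The point about retrofitting is that the call itself is small. A sketch of what "adding the functionality to make a call over HTTP" can amount to, in Python using only the standard library (the endpoint URL and payload shape are hypothetical):

```python
import json
import urllib.request

def build_request(payload: dict, url: str) -> urllib.request.Request:
    """Package a JSON POST the way a typical REST service expects it."""
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def call_new_service(payload: dict,
                     url: str = "http://internal.example/api/v1/score") -> dict:
    """Minimal HTTP client a legacy, non-RESTful application could bolt on."""
    with urllib.request.urlopen(build_request(payload, url), timeout=5) as resp:
        return json.loads(resp.read().decode("utf-8"))
```

A real integration would add retries, authentication and error handling, but the core wiring is a few dozen lines in most languages, which is why budget, validation and deployment, not code, tend to be the blockers.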



Quote for the day:


"A leadership disposition guides you to take the path of most resistance and turn it into the path of least resistance." -- Dov Seidman


Daily Tech Digest - September 23, 2019

Artificial intelligence in marketing: when tech converges with the traditional

The first step is to identify what is meant by a ‘buying coalition’. A buying coalition more accurately reflects the temporary alliance of distinct personas that is assembled to make a purchasing decision. Previously, organisations didn’t have automated contact creation; that knowledge existed only in the sales rep’s mind. Now, organisations are able to target the right buyers at the right time more accurately. By applying AI to CRM data, organisations are able to create full contacts for all potential buyers engaged in the decision-making process by capturing relevant activity and associating it with the correct opportunity. AI can then track all touch points with the various opportunity contacts, analyse how they were engaged and who was engaged before and after them, to show both the optimal number of buyers needed to close a deal and how to sequence and communicate with these buyers in order to build a strategic coalition. This can only be done once AI has mapped the CRM data to the correct opportunity. We’re able to do this with our own persona-driven analysis as well.



Navigating the .NET Ecosystem

While it’s easy to get caught up in the past, and grumble over previous concerns and frustrations, we must move forward. Perhaps the most logical path forward is to unify .NET Core and .NET Framework ... dare I say, "Let’s make .NET great again!" Maybe I’ve gone too far, but let’s discuss the future. Where is Microsoft steering us? Let’s take a step back for a moment and discuss where we’ve come from before diving into where we’re going. Not all .NET developers are aware of how their code compiles, and what it truly produces. "From the very beginning, .NET has relied on a just-in-time (JIT) compiler to translate Intermediate Language (IL) code to optimized machine code." — Richard Lander Revisiting my earlier mention of the Mono project, we know there have been significant efforts around ahead-of-time (AOT) compilation for .NET, which Mono has achieved with its industry-leading LLVM compiler infrastructure.



Since last year, Google has been working with online developer education company Udacity to provide free lessons and has now packaged them as 'codelab courses that are formatted like tutorials'.  The courses are aimed at developers who have some experience in programming object-oriented, statically typed languages like Java or C# and who've used IDEs such as JetBrains' IntelliJ IDEA, Android Studio, Eclipse, or Microsoft's Visual Studio. Students will need to install the Java Development Kit (JDK) and IntelliJ. Google promotes Kotlin as a "concise" and "modern object-oriented language [that] offers a strong type system, type inference, null safety, properties, lambdas, extensions, coroutines, higher-order functions".  It started offering free courses last year via the Kotlin Bootcamp course and is now offering them in the Google Developers Codelabs format. "Google and Udacity currently offer video-based courses for Kotlin Bootcamp and How to build Android apps in Kotlin," said Jocelyn Becker, senior program manager of Google Developer Training.



Your competitive edge: Unlock new possibilities with Edge computing

Today more than ever, there is little tolerance from employees, clients, or consumers when it comes to IT failures - fortunately we are entering a new age of technologies that will minimize this risk. Software defined networks (SDN) are becoming increasingly adept at identifying specific customer needs and running workloads to locations that best serve specific cost, latency and security requirements. When combined with the potential of AI to inform decision making, we can see a perfect storm of capabilities that will deliver a step change in how computing and networks converge to deliver a personalized service to customers who embrace these technologies. The added capability of MEC provides an extra layer, protecting essential services from outages or connectivity issues stemming from rare, but damaging ISP failures or cloud server downtime. This approach to network architecture can also enable local hosting of data, allowing data privacy to be better governed.


New application security risks lead IT teams to DevSecOps


This type of organizational change doesn't come easily, but progress is possible, Ewert said. "When I started four years ago, there was a real 'us vs. them' mentality between developers and security," he said. "It took us three years to get the culture to the point where they approach us [for help]." That shift came about through training sessions and explanations of why certain security rules and practices are necessary, Ewert said. The SecOps team also found unobtrusive ways to dovetail its efforts with the rest of the organization's; for example, it piggybacked on an IT monitoring project that rolled out standardized monitoring agents across the organization to get security monitoring agents installed. It has also been helpful for Ewert's team to frame security recommendations in terms of how they can make the application and infrastructure more efficient, such as by avoiding costly distributed denial-of-service attacks. "We only block a release if it's absolutely critical."


Data Everywhere At the Edge of Compute & Energy Efficiency

In traditional or classic software development, or SW 1.0, a typical project for a software team may involve creating algorithms and writing millions of lines of instruction code. SW 2.0 is a new way of thinking in which value creation comes not from code writing, but from data curation. We collect data from our devices, select relevant data sets, verify and label them. We then use that “curated” data to train machine learning (ML) models. Imagine the data involved in running a large printing press. Curating data from a printing press might include engineers looking at images of final press output, finding defects (e.g., lines, splotches, roller marks, etc.), then training an ML model to detect those defects. The model “runs” in real-time at the press to monitor output. With SW 2.0, simple problems are quickly trained, and issues that might otherwise be impossible just take a bit longer.
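In code terms, the SW 2.0 loop is "label examples, fit a model, run it on live output" rather than hand-writing defect rules. A deliberately tiny stand-in for that loop (a nearest-centroid classifier in pure Python; a real press-monitoring system would train on images with a proper ML library, and these feature names are invented for illustration):

```python
def centroid(rows):
    """Mean feature vector of a labelled group of examples."""
    return [sum(col) / len(rows) for col in zip(*rows)]

def train(examples):
    """examples: {label: [feature_vector, ...]} -- the curated, labelled data."""
    return {label: centroid(rows) for label, rows in examples.items()}

def predict(model, x):
    """Assign x to the label whose centroid is closest (squared distance)."""
    return min(model, key=lambda lbl: sum((a - b) ** 2 for a, b in zip(model[lbl], x)))

# Curated data: e.g. (ink_density, streak_score) measured from press output
model = train({
    "ok":     [[0.9, 0.1], [1.0, 0.2]],
    "defect": [[0.4, 0.8], [0.5, 0.9]],
})
print(predict(model, [0.45, 0.85]))  # defect
```

Notice that improving this "program" means curating better examples, not editing the three functions, which is exactly the shift in value creation the article describes.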


Enterprises can hire software robots at the push of a button


The IT services roles are internal, providing services to IT teams, but most of the others in banking, insurance and retail are customer-facing, with bots being the first point of contact for customers making enquiries. Dube said about 112,000 customer calls a day are handled by IPsoft customer Telefonica in Peru using Amelia. “These calls are made when people have a problem with their account,” he said. “Humans have to be liberated from these chores.” Dube described the jobs being done by robots as “high-friction, low-margin roles”. They are jobs that have to be done, have high costs, but businesses do not make money out of them. Firms are replacing large numbers of staff in such roles with software robots. The World Economic Forum’s Future of jobs 2018 report predicted that 75 million current jobs will be automated by 2022, and that 52% of jobs today will be done by robots by 2025.



Interview with Scott Hunter on .NET Core 3.0

Many times when we talk about .NET Core 3.0, we talk about the new desktop support, but there is also a lot of innovation in ASP.NET. First up, while we are not bringing back WCF, we know that many developers want to program high-performance, contract-based RPC services in their applications. We are embracing the open-source gRPC project for these workloads. We are working to make the .NET implementation first class, and because it is gRPC, it works with many other programming languages as well. There is a new microservices-related Worker Service project for building lightweight background workers, which can be run under orchestrators like Kubernetes. Also, while ASP.NET has great support for building APIs, we want to make it easy to put rich security on top of them, so we are adding bridges to use our APIs with the open-source Identity Server project. Finally, we are working on Blazor, which enables developers to build high-performance web applications using .NET in both the browser and the server, using WebAssembly.


Innovation: How to get your great ideas approved


Historical precedent suggests that a top-down approach to innovation can produce big benefits. Harvard Business Review (HBR) tracked the inventing history of 935 CEOs at publicly listed US high-tech companies and found that one in five of these successful firms has what it refers to as an inventor CEO: a boss who is named on at least one patent. While the research is focused on the high-tech sector, HBR concludes that boards of directors should pay close attention to the inventor credentials of their executive teams. In the case of RBS, Hanley believes the senior team's role is to help point out opportunities and ensure any innovations help the business keep pace with a fast-changing finance market. "I guess our job is to not only think about the future, but almost work backwards from the future – we need to understand and have a point of view as to how we think the world is changing," says Hanley. "We need to articulate that view to all of our key stakeholders and then make sure we do something about it."


Cloud security: Weighing up the risk to enterprises

A good analogy is that clouds are like roads; they facilitate getting to your destination, which could be a network location, application or a development environment. No-one would enforce a single, rigid set of rules and regulations for all roads – many factors come into play, whether it’s volume of traffic, the likelihood of an accident, safety measures, or requirements for cameras. If all roads carried a 30 mile per hour limit, you might reduce fatal collisions, but motorways would cease to be efficient. Equally, if you applied a 70 mile per hour limit to a pedestrian precinct, unnecessary risks would be introduced. Context is very important – imperative in fact. The same goes for cloud computing. To assess cloud risk, it is vital that we define what cloud means. Cloud adoption continues to grow, and as it does, such an explicit delineation of cloud and on-premise will not be necessary. Is the world of commodity computing displacing traditional datacentre models to such an extent that soon all computing will be elastic, distributed and based on virtualisation? 



Quote for the day:


"Leadership does not always wear the harness of compromise." -- Woodrow Wilson


Daily Tech Digest - September 22, 2019

The augmented city: how technologists are transforming the Earth into theater

The Augmented City, by Scape Technologies
Want incredible immersive experiences in your city? Remaining technological hurdles include saving digital content to a location, accessing a real-time 3D semantic world map, occlusion of digital content by the physical world, and multiplayer support. Centimeter-level positioning is required. However, Global Navigation Satellite Systems (GNSS) such as BeiDou, Galileo, and GPS fail to achieve this without the software and hardware to tap into geodetic infrastructure. Advancing capabilities of consumer cameras, leveraging dual raw GNSS data, 5G networks, and computer vision offer potential solutions, including triangulating position from landscape images snapped from a smartphone. In 2019, Buckingham Palace in London became one of the locations augmented – or enabled, or activated – via Snap’s Lens Studio Landmarker, enabling real-time immersive AR experiences. Lens Studio has achieved over 400,000 AR lenses and 15 billion plays. Snap currently has a US market catchment of 90% of 13- to 24-year-olds, a higher share than Facebook or Instagram.


Origins of Enterprise Architecture Frameworks

Over the last thirty years, one EA framework has risen to become the most popular: The Open Group Architecture Framework, or TOGAF. The Open Group was formed in 1996 as a result of the merger of the Open Software Foundation and X/Open Company. The mission was to form a consortium that seeks to enable the achievement of business objectives through the development of open, vendor-neutral technology standards. The Open Group grew to over 650 active members who create standards for the field of computer engineering. Through this effort The Open Group created ArchiMate, a modeling language that breaks systems down into active structures, passive structures and behaviors. TOGAF is currently in version 9.2, but the most widely recognizable feature of The Open Group’s TOGAF is the ADM, or Architecture Development Method. This model uses a cyclical approach to the development of an architecture. The cycle consists of developing a vision; defining the business, application, data, and technology domains; planning; managing change; deploying; and governing the architecture while maintaining the requirements as a central focal point.


Make Artificial Intelligence Work for Your Business Needs

Enterprises beginning their AI journeys often rely on the services of the software provider or an AI development company to provide necessary customization. Some organizations, however, attempt to tackle the work in house, often with mixed results. "Having internal AI capability – a combination of talent, platforms, tools, knowledge, relationship, and data – offers the option of doing it internally versus outsourcing," said Monika Wilczak, an advisory managing director in artificial intelligence at business services advisory EY. "The stronger the internal AI capability, and more mature the enterprise is around the application of AI as a strategy for growth, the more likely it is to use their own data scientists and application engineers for customization," she explained. Still, even enterprises with full-fledged AI development teams can find customization to be an expensive and time-consuming undertaking. "Customization of vendors’ AI products requires data class inclusiveness, controls to avoid data bias, and the availability of a sufficient volume of labeled data,"



How To Drive Innovation During A Recession

Fast-Fail Innovation is technically easy for us to do, but we have no idea if anyone will buy these ideas from our company. This is where entrepreneurs play. Here you must go to market to quickly test and learn. You expect to fail fast and often before succeeding with an offering that may literally be refined by your customers’ in-market feedback. Unfortunately, although this type of innovation can be done quickly and inexpensively, your team must be ready to experience many, many, maaaaaaany failures before they find a winning, new idea. Under the pressure of a recession, teams are afraid to fail for fear of losing their jobs, so they will actively avoid engaging in the very activity that makes this quadrant successful. ... Differentiation Innovation is technically difficult for us to do, but we know our customers really want it. We know this because we can measure which problems to fix first, second and third. We can measure the size of each opportunity. We can measure the price customers will pay us if we address a specific need or problem.


Microsoft: Cyberattacks now the top risk, say businesses


This year, the second most widely considered top-five risk is economic uncertainty, followed by brand damage, regulation, and loss of key personnel.  The World Economic Forum (WEF) 2019 Global Risks Report ranks data theft and cyberattacks as top-5 risks in terms of likelihood, but they are behind extreme weather events and climate change concerns. Of course, since 2017 the world has seen the damage caused by the WannaCry ransomware outbreak, which the US government blamed on North Korea. It was shortly followed by the hugely costly NotPetya malware, which was blamed by governments in the West on Kremlin hackers. Criminal ransomware attacks continue to strike targets too, such as the attack on Norsk Hydro earlier this year that cost it $40m. And over the past few months, multiple US local governments have weathered targeted ransomware attacks with at least one attacker demanding a ransom payment of $5.3m. Lately, universities across the West have come under fire from state-sponsored hacking groups in search of intellectual property. However, these days business email compromise (BEC) is shaping up to be the most costly and common threat.


Facial recognition technology threatens to end all individual privacy

The consequences can be even more malign. Experts including the London police ethics panel argue that facial recognition could have a racial and gender bias. That is certainly what the American experience with this technology implies. The technology relies on sifting through the biometric data of thousands of people on criminal databases. But the datasets do not have enough information on racial minorities or women to be accurate. Many of these groups already have a deep mistrust of the police. Being wrongly targeted by a racially biased algorithm will not help this. And it is not just the state that is involved. An investigation by Big Brother Watch found that privately owned sites – including shopping centres, property developers, museums and casinos – have been using facial recognition, too. A trial in Manchester’s Trafford Centre scanned more than 15 million faces before ultimately being stopped in its tracks by the surveillance camera commissioner. ... Sadly, the high court in Wales did not grasp the conflict with civil liberties, recently ruling that a facial recognition trial by South Wales police was legal.


How a hacked Jeep Cherokee led to increased security from cyber carjackers


Harman saw its Jeep hack experience as a viable business opportunity: the supplier today sells cybersecurity software that allows automakers to monitor their fleets and provide over-the-air software updates. Analysts at IHS Markit consider Harman one of the top players in that segment, with some 20 automakers using its over-the-air services. Harman does not break out revenue for that business. But the company does try to recover some costs by charging higher prices for advanced security. "We have to educate our sales people in conversations with carmakers' purchasing departments and say 'don't let this go without adding cybersecurity to your quote'," said Amy Chu, Harman's senior director of automotive product security. Asaf Atzmon, the Israel-based vice president and general manager for automotive cybersecurity, said Harman has come a long way since he joined in March 2016 as part of the TowerSec deal. At the time, Harman employed only some security architects, and the company later changed its organizational structure, appointing or hiring professionals such as Wood and Chu to oversee cybersecurity efforts, Atzmon said.


Shared resources enable greater collaboration: big science in the cloud

The experience in developing DataLabs has provided a springboard for rolling out similarly collaborative platforms such as solutions supporting the Data and Analytics Facility for National Infrastructure (DAFNI). This is a project that aims to integrate advanced research models with established national systems for modelling critical infrastructure. “Led by Oxford University and funded by the EPSRC, the initiative aspires over the next 10 years to be able to model the UK at a household level, 50 years into the future,” explains Nick Cook, a senior analyst at Tessella. Here, the firm is involved in conceptualizing DAFNI’s capabilities and implementation roadmap. One of the project’s early goals is to create a “digital twin” of a UK city such as Exeter – in other words, to virtually describe a city with a population of several hundred thousand people together with its transport infrastructure, utility services and environmental context. This digital twin would, for example, help planners to decide where to invest in new road or rail networks, and to identify the best sites for housing, schools and doctors’ surgeries.


Automation in the workplace could disproportionately affect women


It wouldn’t be unprecedented. Decades ago, roles like “social media manager” and “data scientist” hadn’t been conceived, much less sought after. Krishnan said that typically, roughly 10% of employment at any given time is in these newly emerged groups of occupations, amounting to 160 million jobs globally. Whether they take up new work or acquire new skills in their current fields, Krishnan anticipates that tens of millions of workers will have to make some sort of occupational transition by 2030. Many of those workers are women — between 40 million and 160 million globally. Encouragingly, in both developed and emerging markets, the new jobs that are expected to come into vogue are likely to be higher-wage, according to Krishnan. Those jobs will furthermore involve less drudgery, which will be traded for tasks ostensibly more socially and intellectually stimulating. In fact, Krishnan believes that this future of work will require more interpersonal know-how of the workers who occupy its roles.


How Artificial Intelligence is Changing the Landscape of Digital Marketing

Artificial intelligence tools help digital marketers understand customer behavior and make the right recommendations at the right time. A tool with millions of predefined conditions knows how customers react to a particular situation, ad copy, video or any other touchpoint, and humans can’t assess such a large set of data better than a machine in a limited timeframe. With the help of AI, you have those insights at your fingertips. Where to find an audience? How to interact with them? What to send them, and how? What is the right time to connect? When to send a follow-up? All these answers lie in AI-powered digital marketing platforms. With smart pattern analysis, AI tools can make better suggestions and help in decision making. A personalized content recommendation to the right audience at the right time guarantees the success of any campaign. Digital marketers are being pushed ever harder to demonstrate the success of content and campaigns; with AI tools, putting that wealth of data to use is easy and effective.



Quote for the day:


"We can't understand someone else's ideas while we're busy thinking about our own." -- Tim Fargo


Daily Tech Digest - September 21, 2019

The Carbon Cost Of Digital Tech

“The cloud is more efficient, but that doesn’t automatically mean it’s more environmentally friendly,” Adams points out. “A less efficient data centre running on renewables will almost always be a better choice, environmentally speaking, than an efficient one that uses coal.” For the last 10 years, the Green Web Foundation has maintained the world’s largest database of website and digital providers using renewable power, making it easier for companies to find greener options. Becoming a sustainable, environmentally positive business isn’t just about decarbonisation. It’s equally important, for example, to think about where investment is going. Take Google, a company that has taken significant steps to run data centres more efficiently. Last year, the tech giant was the world’s largest corporate buyer of renewable energy, having reached its goal of sourcing 100 per cent of its energy from renewable sources in 2017. Despite this, Google is far from perfect. “Google is funding climate deniers in the US, and supporting politicians who consistently cripple effective legislation in other sectors, as well as aggressively chasing business in the oil and gas sector,” Adams says.



The Struggles of Innovative People

In his book, Adam Grant draws on a whole range of examples of social struggles in the USA, including, among others, the civil rights movement and the fight for women's right to vote. He identifies a pattern that comes up very often. First, people who would be described as radical, and often violent, highlight the problem. They are poorly supported and misunderstood, and they fail – but not totally. Their message is heard by more moderate people, who want to work through far more legal means, and with a non-aggressive message. It is then that the people in power give in to their demands, because the subject of the struggle is presented in a non-aggressive way and heard by all – unlike when it came from the violent radicals. Some will argue that governments negotiate with non-violent people only because violent people keep up the threat. Either way, we can learn two things from this: you have to get the message across, and the people in power must fear something. It is the combination of fear plus solution that makes people move.


Ransomware: 11 steps you should take to protect against disaster

It's not just criminal gangs that have noticed the power of ransomware: state-backed hacking groups have also used ransomware to create both chaos and profit for their backers. ... A recovery plan that covers all types of tech disaster should be a standard part of business planning, and should include a ransomware response. That's not just the technical response -- cleaning the PCs and reinstalling data from backups -- but also the broader business response that might be needed. Things to consider include how to explain the situation to customers, suppliers and the press. Consider whether regulators need to be notified, or if you should call in police or insurers. Having a document is not enough: you also need to test out the assumptions you have made, because some of them will be wrong. ... First, there's no guarantee that the criminals will hand over the encryption key when you pay up -- they are crooks, after all. If your organisation is seen to be willing to pay, that will probably encourage more attacks, either by the same group or others. There's also the broader impact to consider.


The Driver in the Driverless Car

This book came about from a simple observation. I noticed that even my techie friends in Silicon Valley were feeling overwhelmed by the pace of technological change. I also believe that the risks of letting technology just develop without thinking through societal implications are a massive problem - look at the rise of Facebook and all the problems that came because it refused to consider the privacy implications, or the implications of its tools being hijacked for genocide and hate speech. ... Robots are good for three types of tasks: dirty, dangerous and dull jobs. Dirty jobs might be, for example, cleaning out oil pipelines. Dangerous jobs are things like bomb disposal, or drones inspecting communications towers, infrastructure, or rooftops for faults and damage. Dull jobs are things like delivering food in a hospital or dispensing medicine. Curiously, autonomous vehicles are a great use case for robots - driving is both dangerous and dull. It’s also important to capture that a job can be both complex and based on repetition.


Adoption of AI Surveillance Technology Surges

"Sadly I'm not surprised," says Alan Woodward, a computer science professor at the University of Surrey, commenting on the report's finding that AI surveillance technology is being rapidly embraced by governments. "Adoption of something this useful for security is bound to run ahead, and as is so often the case, particularly ahead of the legislation or regulation one might hope for." Technology, of course, is the limiting practical factor when designing more automated surveillance systems. But tool set capabilities and combinations have been rapidly improving. "Several breakthroughs are making new achievements in the field possible: the maturation of machine learning and the onset of deep learning; cloud computing and online data gathering; a new generation of advanced microchips and computer hardware; improved performance of complex algorithms; and market-driven incentives for new uses of AI technology," says Feldstein, who's also a nonresident fellow in Carnegie's Democracy, Conflict, and Governance Program. He formerly served as a deputy assistant secretary in the Democracy, Human Rights, and Labor Bureau at the U.S. Department of State.


How to better integrate IT security and IT strategy

“The human element is the biggest risk facing any IT organization today,” McGibney says. “A successful phishing campaign can easily bring a company to a screeching halt. To provide true defense in depth, IT and security need to work together to implement solutions across the attack surface, whether it be on-[premises] solutions or cloud-based. What the security group implements affects infrastructure, and what infrastructure implements affects security. They truly go hand-in-hand.” IT and security teams need to understand what they are both trying to accomplish, and why it’s important to the organization, Wenzler says. “It’s easy to get risk strategies out of alignment with technology goals when the two sides don’t talk to each other,” he says. “While separate functions, they are integral to each other’s success, so without constant communication they’ll remain out of sync.” It’s also important for the two disciplines to build better relationships with each other. Information security people are sometimes seen as roadblocks to projects and hindering workflows, Cardamone says.


How a small business should respond to a hack

Responding to an attack starts long before it occurs. You should – if you haven't already – put in place an action plan for responding to an attack. All staff should know what is expected of them if the worst occurs, and particularly how to respond to customers who might be worried about their personal data being stolen. You should also prioritize the parts of your business that are most at risk during a cyberattack and focus your security measures on them. Many small businesses cannot afford to invest in sophisticated security measures for the whole of their IT infrastructure, but you can protect the systems and databases that contain the most sensitive information. Regular audits of the information you hold will also help you to identify exactly what has been stolen and will also help law enforcement track down the culprits. ... First, it's important that all of your staff know how to identify a hack at the earliest possible opportunity. If you can catch an attack whilst it is still in progress, all the better: this might allow law enforcement to identify the criminal immediately.


Java SE 13 adds performance, security, stability features

Gil Tene, CTO of Azul Systems, said he was skeptical of the sped-up Java release cadence, but the stability of the Java Community Process (JCP) along with the Java reference implementation and the Java Technology Compatibility Kit (TCK) gave him assurance that the speedier cadence would work just fine. Bruno Souza, president of SouJava, a Java user group based in Brazil, concurred. In moving to the six-month cadence, Oracle and the JCP kept its commitment to a fast, open source development model, but also kept the same standards of compatibility and quality, he said. OpenJDK, the open source version of Java SE, is where innovation will occur, Souza said. "And the TCK lets us verify that all these implementations run the same way," he added. Meanwhile, Java tools vendors have begun to take the Java release cadence into account for their products and services. For instance, Mala Gupta, a developer advocate at JetBrains, which produces the popular IntelliJ IDEA Java IDE, said JetBrains has a four-month release cycle that is tuned to keep up with each new release of Java.


Encrypted Smartphone Takedown Outed Canadian Mole

Bill Majcher, a former RCMP officer with extensive experience in conducting undercover operations, tells Global News that Ortis would have had access to almost any type of classified information, which the publication notes "could include the force's blueprints for covert operations worldwide, as well as the identities of undercover officers, police agents working within transnational crime groups, officers from Five Eyes partners used in RCMP probes, and even witnesses relocated to other countries." Lucki says the charges against Ortis have "shaken many people throughout the RCMP," as well as Canada's intelligence partners. "While these allegations, if proven true, are extremely unsettling, Canadians and our law enforcement partners can trust that our priority continues to be the integrity of the investigations and the safety and security of the public we serve," she says. The arrest of Ortis appears to have resulted from authorities taking down a secure smartphone service marketed to criminals.


Important Things You Need To Know About Agile Development

In the agile world, testing becomes a regular part of the process. Small pieces of the project are tested and presented on a regular basis. This gives everyone a better sense of the project’s timeline. In addition, this frequent testing allows developers to catch bugs before they become deeply entrenched in the code. ... Agile development does not mandate particular practices, but a number of standard practices have come into place as a result of the values of the agile model. One common example is pair programming. Here, two developers work together as they code a piece of the project: one programmer does the coding, and the other reviews the code as it is written. Another common agile practice is the daily standup meeting, in which each team gives a status report to the project manager and new goals are set for the current day. The final practice is working in sprints. Rather than set one long-term goal, many agile models encourage doing work in short bursts. Many offices use a two-week sprint model, where developers try to resolve as many issues or user stories as they can in that short period.



Quote for the day:


"Leaders must know where they are going if they expect others to willingly join them on the journey." -- Kouzes & Posner


Daily Tech Digest - September 20, 2019

Digitalization: Welcome to the City 4.0

Applied to cities, digitalization can not only improve efficiency by minimizing the waste of time and resources, but it will simultaneously improve a city’s productivity, secure growth, and drive economic activities. The Finnish capital of Helsinki is currently in the process of proving this. An early adopter of smart city technology and modeling, it launched the Helsinki 3D+ project to create a three-dimensional representation of the city using reality capture technology provided by the software company Bentley Systems for geocoordination, evaluation of options, modeling, and visualization.  The project’s aim is to improve the city’s internal services and processes and provide data for further smart city development. Upon completion, Helsinki’s 3-D city model will be shared as open data to encourage commercial and academic research and development. Thanks to the available data and analytics, the city will be able to drive its green agenda in a way that is much more focused on sustainable consumption of natural resources and a healthy environment.



How to decommission a data center

"They need to know what they have. That’s the most basic. What equipment do you have? What apps live on what device? And what data lives on each device?” says Ralph Schwarzbach, who worked as a security and decommissioning expert with Verisign and Symantec before retiring. All that information should be in a configuration management database (CMDB), which serves as a repository for configuration data pertaining to physical and virtual IT assets. A CMDB “is a popular tool, but having the tool and processes in place to maintain data accuracy are two distinct things," Schwarzbach says. A CMDB is a necessity for asset inventory, but “any good CMDB is only as good as the data you put in it,” says Al DeRose, a senior IT director responsible for infrastructure design, implementation and management at a large media firm. “If your asset management department is very good at entering data, your CMDB is great. [In] my experience, smaller companies will do a better job of assets. Larger companies, because of the breadth of their space, aren’t so good at knowing what their assets are, but they are getting better.”
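The inventory discipline DeRose and Schwarzbach describe boils down to reconciling what a scan actually finds against what the CMDB claims exists. The sketch below is illustrative only: the asset IDs, data shapes, and field names are invented for the example, and a real CMDB would expose far richer records and an API rather than in-memory dicts.

```python
# Hypothetical sketch: reconcile scanned assets against CMDB records
# so a decommissioning project starts from an accurate inventory.

def reconcile(cmdb_records, discovered_assets):
    """Return untracked assets, stale CMDB entries, and verified matches.

    Both inputs are dicts keyed by a unique asset ID (e.g. a serial
    number), mapping to metadata such as the apps hosted on the device.
    """
    cmdb_ids = set(cmdb_records)
    found_ids = set(discovered_assets)
    return {
        "untracked": sorted(found_ids - cmdb_ids),  # on the floor, not in CMDB
        "stale": sorted(cmdb_ids - found_ids),      # in CMDB, nowhere to be found
        "verified": sorted(cmdb_ids & found_ids),   # safe starting point
    }

cmdb = {"srv-001": {"apps": ["billing"]}, "srv-002": {"apps": ["dns"]}}
scan = {"srv-002": {"apps": ["dns"]}, "srv-003": {"apps": ["unknown"]}}
report = reconcile(cmdb, scan)
print(report)
```

The point of the toy is Schwarzbach's warning in miniature: the "stale" and "untracked" buckets are exactly the data-accuracy gaps that make a CMDB only as good as what is put into it.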


The Problem With “Cloud Native”

The problem is thinking about and creating a common understanding around a change that big. Here the industry does itself no favors. For years, many people thought cloud technology was somehow part of the atmosphere itself. In reality, few things are so very physical: Big public cloud computing vendors like Amazon Web Services, Microsoft Azure, and Google Cloud each operate globe-spanning systems, with millions of computer servers connected by hundreds of thousands of miles of fiber-optic cable. Most people now know the basics of cloud computing, but understanding it remains a problem. Take a current popular term, “cloud native.” Information technologists use it to describe strategies, people, teams, and companies that “get” the cloud and use it for maximum utility. Others use it to describe an approach to building, deploying, and managing things in a cloud computing environment. People differ. Whether it’s referring to people or software, “cloud native” is shorthand for operating with the fullest power of the cloud.


Why You Need a Cyber Hygiene Program

Well-known campaigns and breaches either begin or are accelerated by breakdowns in the most mundane areas of security and system management. Unpatched systems, misconfigured protections, overprivileged accounts and pervasively interconnected internal networks all make the initial intrusion easier and make the lateral spread of an attack almost inevitable. I use the phrase “cyber hygiene” to describe the simple but overlooked security housekeeping that ensures visibility across the organization’s estate, that highlights latent vulnerability in unpatched systems and that encourages periodic review of network topologies and account or role permissions. These are not complex security tasks like threat hunting or forensic root cause analysis; they are simple, administrative functions that can provide value far in excess of more expensive and intrusive later-stage security investments. ... The execution of most cyber hygiene tasks falls squarely on the shoulders of the IT, network and support teams.
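Two of the hygiene checks named above, unpatched systems and overprivileged accounts, lend themselves to simple periodic sweeps. The following is a minimal sketch under invented assumptions: the baseline patch level, role names, and inventory data shapes are all made up for illustration, and a real sweep would pull this data from patch-management and identity systems.

```python
# Illustrative hygiene sweep over a toy asset/account inventory:
# flag hosts below a baseline patch level, and accounts holding
# admin rights outside roles that plausibly need them.

BASELINE_PATCH = 7  # assumed minimum acceptable patch level

def hygiene_report(hosts, accounts):
    unpatched = [h["name"] for h in hosts if h["patch_level"] < BASELINE_PATCH]
    overprivileged = [
        a["user"] for a in accounts
        if a["is_admin"] and a["role"] not in ("sysadmin", "security")
    ]
    return {"unpatched": unpatched, "overprivileged": overprivileged}

hosts = [{"name": "web01", "patch_level": 9},
         {"name": "db02", "patch_level": 4}]
accounts = [{"user": "alice", "role": "sysadmin", "is_admin": True},
            {"user": "bob", "role": "marketing", "is_admin": True}]
result = hygiene_report(hosts, accounts)
print(result)
```

Running a report like this on a schedule is the "simple, administrative function" kind of control the passage argues for: cheap, repeatable, and a prerequisite for the more expensive later-stage investments.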


A Beginner's Guide to Microsegmentation

Security experts overwhelmingly agree that visibility issues are the biggest obstacles that stand in the way of successful microsegmentation deployments. The more granularly segments are broken down, the better the IT organization needs to understand exactly how data flows and how systems, applications, and services communicate with one another. "You not only need to know what flows are going through your route gateways, but you also need to see down to the individual host, whether physical or virtualized," says Jarrod Stenberg, director and chief information security architect at Entrust Datacard. "You must have the infrastructure and tooling in place to get this information, or your implementation is likely to fail." This is why any successful microsegmentation needs to start with a thorough discovery and mapping process. As a part of that, organizations should either dig up or develop thorough documentation of their applications, says Stenberg, who explains that documentation will be needed to support all future microsegmentation policy decisions and to ensure the app keeps working the way it is supposed to.
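At its core, the discovery-and-mapping step Stenberg describes is flow aggregation: turning raw flow records into a per-host picture of who talks to whom, on which ports. The toy model below assumes flow records arrive as (source, destination, port) tuples, as one might get from a NetFlow export; real deployments rely on dedicated visibility tooling rather than a few lines of Python.

```python
from collections import defaultdict

# Toy discovery step for microsegmentation: aggregate flow records
# into a host -> [(peer, port), ...] communication map. Segmentation
# policy can only be written once this map is understood.

def build_flow_map(flows):
    """flows: iterable of (src, dst, port) tuples."""
    talks_to = defaultdict(set)           # set dedupes repeated flows
    for src, dst, port in flows:
        talks_to[src].add((dst, port))
    return {host: sorted(peers) for host, peers in talks_to.items()}

flows = [("app01", "db01", 5432), ("app01", "cache01", 6379),
         ("web01", "app01", 8080), ("app01", "db01", 5432)]
flow_map = build_flow_map(flows)
print(flow_map)
```

Even in this toy form the map answers the policy question directly: app01 needs segments permitting 5432 to db01 and 6379 to cache01, and anything not in the map is a candidate for denial by default.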


Cryptomining Botnet Smominru Returns With a Vengeance

Smominru uses a number of methods to compromise devices. For example, in addition to exploiting the EternalBlue vulnerability found in certain versions of Windows, it uses brute-force attacks against MS-SQL, Remote Desktop Protocol and Telnet, according to the Guardicore report. Once the botnet compromises the system, a PowerShell script named blueps.txt is downloaded onto the machine to run a number of operations, including downloading and executing three binary files - a worm downloader, a Trojan and a Master Boot Record (MBR) rootkit, Guardicore researchers found. Malicious payloads move through the network via the worm module. The PcShare open-source Trojan has a number of jobs, including acting as the command-and-control, capturing screenshots and stealing information, and most likely downloading a Monero cryptominer, the report notes. The group behind the botnet uses almost 20 scripts and binary payloads in its attacks. Plus, it uses various backdoors in different parts of the attack, the researchers report. Newly created users, scheduled tasks, Windows Management Instrumentation objects and services all run when the system boots, Guardicore reports.


How to prevent lingering software quality issues


To build in quality, he advocates that IT undertake systematic approaches to software testing. In manufacturing, building in quality entails designing a process that helps improve the final product, while in IT that approach is about producing a higher-quality application. Yet, software quality and usability issues are, in many ways, harder to diagnose than problems in physical goods manufacturing. "In manufacturing, we can watch a product coming together and see if there's going to be interference between different parts," Gruver writes in the book. "In software, it's hard to see quality issues. The primary way that we start to see the product quality in software is with testing. Even then, it is difficult to find the source of the problem." Gruver recommends that software teams put together a repeatable deployment pipeline, which enables them to have a "stable quality signal" that informs the relevant parties as to whether the amount of variation in performance and quality between software builds is acceptable.
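Gruver's "stable quality signal" can be pictured as a gate in the deployment pipeline that compares each build's metrics against the previous build and flags variation beyond an agreed tolerance. The sketch below is a simplified illustration, not Gruver's implementation; the metric names and the 5% threshold are invented for the example.

```python
# Hypothetical quality gate: flag metrics whose relative change
# between two builds exceeds an agreed tolerance.

def quality_signal(previous, current, tolerance=0.05):
    """Return {metric: relative_change} for out-of-tolerance metrics."""
    regressions = {}
    for metric, old in previous.items():
        new = current.get(metric)
        if new is None or old == 0:
            continue  # no comparable signal for this metric
        change = (new - old) / old
        if abs(change) > tolerance:
            regressions[metric] = round(change, 3)
    return regressions

prev = {"p95_latency_ms": 120.0, "error_rate": 0.010, "tests_passed": 980}
curr = {"p95_latency_ms": 150.0, "error_rate": 0.011, "tests_passed": 981}
flagged = quality_signal(prev, curr)
print(flagged)
```

An empty result means the variation between builds is acceptable and the pipeline proceeds; a non-empty one is the early, visible quality problem that is otherwise so hard to see in software.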


The arrival of 'multicloud 2.0'

What’s helpful about the federated Kubernetes approach is that this architecture makes it easy to deal with multiple clusters running on multiple clouds. This comes from two major building blocks. First is the capability of syncing resources across clusters. As you may expect, this is the core challenge for those deploying multicloud Kubernetes; mechanisms within Kubernetes can automatically sync deployments across multiple clusters running on many public clouds. Second is intercluster discovery: the capability of automatically configuring DNS servers and load balancers with backends spanning all clusters running across many public clouds. The benefits of leveraging multicloud/federated Kubernetes include high availability, considering you can replicate active/active clusters across multiple public clouds. Thus, if one has an outage, the other can pick up the processing without missing a beat. You also avoid dreaded provider lock-in, because Kubernetes is the abstraction layer that removes you from the complexities and native details of each public cloud provider.
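As a hedged illustration of the first building block, the KubeFed project syncs resources by wrapping an ordinary Deployment template together with a placement list naming the member clusters it should be propagated to. The cluster names, namespace, and image below are made up for the sketch:

```yaml
# Hypothetical KubeFed resource: one Deployment synced to clusters on two clouds.
apiVersion: types.kubefed.io/v1beta1
kind: FederatedDeployment
metadata:
  name: web
  namespace: demo
spec:
  template:                  # an ordinary Deployment spec
    spec:
      replicas: 3
      selector:
        matchLabels:
          app: web
      template:
        metadata:
          labels:
            app: web
        spec:
          containers:
          - name: web
            image: nginx:1.17
  placement:
    clusters:                # member clusters registered with the KubeFed control plane
    - name: aws-cluster
    - name: gcp-cluster
```

The control plane then keeps the Deployment in sync on every listed cluster, which is exactly the active/active replication described above.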


Microservices With Node.js: Scalable, Superior, and Secure Apps

Node.js is designed to make highly scalable apps easier to build: its non-blocking I/O and event-driven model make it suitable for data-centric and real-time apps such as real-time collaboration tools, streaming and networking apps, and other data-intensive applications. Microservices, on the other hand, make it easy for developers to create smaller services that are scalable, independent and loosely coupled - very suitable for complex, large enterprise applications. The nature and goals of these two concepts are identical at the core, making them a natural fit for each other. Used together, they can power highly scalable applications that handle thousands of concurrent requests without slowing down the system. Microservices and Node.js have also given rise to cultures like DevOps, where frequent, faster deliveries are valued over the traditional long development cycle. Microservices are closely associated with container orchestration - in practice, microservices are typically managed by a container platform, offering a modern way to design, develop, and deploy software.


Supply Chain Attacks: Hackers Hit IT Providers

Symantec says the group has hit at least 11 organizations, mostly in Saudi Arabia, and appears to have gained admin-level access to at least two organizations as part of its efforts to parlay hacks of IT providers into the ability to hack their many customers. In those two networks, it notes, attackers had managed to infect several hundred PCs with malware called Backdoor.Syskit. "This is an unusually large number of computers to be compromised in a targeted attack," Symantec's security researchers say in a report. "It is possible that the attackers were forced to infect many machines before finding those that were of most interest to them." Backdoor.Syskit is a Trojan, written in Delphi and .NET, that's designed to phone home to a command-and-control server and give attackers remote access to the infected system so they can push and execute additional malware on the endpoint, according to Symantec. The security firm first rolled out an anti-virus signature for the malware on Aug. 21. Symantec says the attackers have in some cases also used PowerShell backdoors - a "living off the land" technique, so called because it's tough to spot attackers' use of legitimate tools.



Quote for the day:


"A culture of discipline is not a principle of business; it is a principle of greatness." -- Jim Collins