
Daily Tech Digest - April 22, 2019

Fujitsu completes design of exascale supercomputer, promises to productize it

The new system, dubbed “Post-K,” will feature an Arm-based processor called A64FX, a high-performance CPU developed by Fujitsu and designed for exascale systems. The chip is based on the Armv8 architecture, which is popular in smartphones, with 48 cores plus four “assistant” cores and the ability to access up to 32GB of memory per chip. A64FX is the first CPU to adopt the Scalable Vector Extension (SVE), an instruction set extension specifically designed for Arm-based supercomputers. Fujitsu claims A64FX will offer peak double-precision (64-bit) floating-point performance of over 2.7 teraflops per chip. The system will have one CPU per node and 384 nodes per rack, which comes out to roughly one petaflop per rack. Contrast that with Summit, the top supercomputer in the world, built by IBM for Oak Ridge National Laboratory using IBM Power9 processors and Nvidia GPUs: a Summit rack has a peak compute of 864 teraflops.
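A quick back-of-the-envelope check of those per-rack numbers, as a minimal sketch (the 2.7 TFLOPS per chip, one CPU per node, and 384 nodes per rack are the figures quoted above; everything else is plain arithmetic):

```python
# Back-of-the-envelope check of the Post-K per-rack figures quoted above.
PEAK_PER_CHIP_TFLOPS = 2.7      # A64FX peak double-precision, per the article
CPUS_PER_NODE = 1
NODES_PER_RACK = 384

rack_peak_tflops = PEAK_PER_CHIP_TFLOPS * CPUS_PER_NODE * NODES_PER_RACK
print(f"Peak per rack: {rack_peak_tflops:.0f} TFLOPS (~{rack_peak_tflops / 1000:.2f} PFLOPS)")
# -> roughly 1037 TFLOPS, i.e. just over one petaflop per rack,
#    compared with the 864 TFLOPS quoted for a Summit rack.
```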



As attacks get worse and more commonplace, companies need cybersecurity professionals more and more. But because of a perfect storm of scarce skills and high demand, security jobs come with a high salary, meaning that businesses not only struggle to find the right people, they have to pay top dollar to get them. All of that means that cyber-criminals are having a field day, as the article illustrates. Attackers take advantage of ill-prepared companies, knowing that they are likely to be successful. It’s clear that the industry does need to improve, for the sake of customers and businesses alike. And to do that, we need good people with the right skills. The industry has known for a while that those people are not easy to come by – there are simply not enough of them. There are a lot of reasons for that shortage, and it’s worth bearing in mind that it’s not the easiest industry to work in; the stress of the work means that mental health issues are rife.


Node.js vs. PHP: An epic battle for developer mindshare

Suddenly, there was no need to use PHP to build the next generation of server stacks. One language was all it took to build Node.js and the frameworks running on the client. “JavaScript everywhere” became the mantra for some. Since that discovery, JavaScript has exploded. Node.js developers can now choose between an ever-expanding collection of excellent frameworks and scaffolding: React, Vue, Express, Angular, Meteor, and more. The list is long and the biggest problem is choosing between excellent options. Some look at the boom in Node.js as proof that JavaScript is decisively winning, and there is plenty of raw data to bolster that view. GitHub reports that JavaScript is the most popular language in its collection of repositories, and JavaScript’s kissing cousin, TypeScript, is rapidly growing too. Many of the coolest projects are written in JavaScript and many of the most popular hashtags refer to it. PHP, in the meantime, has slipped from third place to fourth in this ranking and it’s probably slipped even more in the count of press releases, product rollouts, and other heavily marketed moments.


Network analytics tools take monitoring to the next level

These tools help to identify problems as well as assist with capacity planning. Common tools include Simple Network Management Protocol (SNMP), syslog and Cisco NetFlow. While these tools provide some great information, they're siloed systems that work independently from one another. So, to perform any deep investigative work needed to determine the root cause of a particularly tricky network performance issue, IT staff would waste hours bouncing between tools. Modern network analytics tools provide a remedy to this time-consuming and complicated process. Network analytics software draws on traditional monitoring protocols and methods and then adds more sophisticated data flow collection methods. All collected data is then analyzed in real time using AI. By combining all data sources, the analytics platform can comb through far more information than ever before in order to make accurate network performance conclusions.
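As an illustrative sketch of what "combining all data sources" can look like, the snippet below merges SNMP utilization samples and NetFlow records for the same device and time window before flagging anomalies. The record formats, field names, and threshold are hypothetical, not any vendor's API:

```python
from collections import defaultdict

# Hypothetical, pre-collected records; a real deployment would pull these
# from an SNMP poller, a syslog receiver and a NetFlow collector.
snmp_samples = [
    {"device": "core-sw1", "minute": "2019-04-22T10:01", "if_util_pct": 92},
    {"device": "core-sw1", "minute": "2019-04-22T10:02", "if_util_pct": 38},
]
netflow_records = [
    {"device": "core-sw1", "minute": "2019-04-22T10:01", "top_talker": "10.0.0.15", "bytes": 8_200_000_000},
    {"device": "core-sw1", "minute": "2019-04-22T10:02", "top_talker": "10.0.0.7", "bytes": 900_000_000},
]

# Key both feeds on (device, minute) so they can be analyzed together
# instead of being inspected tool by tool.
merged = defaultdict(dict)
for s in snmp_samples:
    merged[(s["device"], s["minute"])]["util"] = s["if_util_pct"]
for f in netflow_records:
    merged[(f["device"], f["minute"])].update(talker=f["top_talker"], flow_bytes=f["bytes"])

UTIL_ALERT_PCT = 90  # illustrative threshold; a real platform would learn baselines
for (device, minute), data in merged.items():
    if data.get("util", 0) >= UTIL_ALERT_PCT:
        print(f"{minute} {device}: {data['util']}% utilization, "
              f"dominant flow {data.get('talker')} ({data.get('flow_bytes', 0):,} bytes)")
```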


A Data Quality Framework for Big Data

Data profiling is a good first step in judging data quality. But it is different for big data than for structured data. Structured methods of column, table, and cross-table profiling can’t easily be applied to big data. Data virtualization tools can create row/column views for some types of big data, where the views can then be profiled using relational techniques. This approach provides useful data content statistics but fails to give a full picture of the shape of the data. Visual profiling shows patterns, exceptions, and anomalies that are helpful in judging big data quality. Most “unstructured” data does have structure, but it is different from relational structure. Visual profiling will help to show the structure of document stores and graph databases, for example. Data samples can then be checked against the inferred structure to find exceptions—perhaps iteratively refining understanding of the underlying structure. Data quality judgment and structural findings should be recorded in a data catalog allowing data consumers to evaluate the usability of the data. With big data, quality must be evaluated as fit for purpose. With analytics, the need for data quality can vary widely by use case. The quality of data used for revenue forecasting, for example, may demand a higher level of accuracy than data used for market segmentation.
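A minimal sketch of the "infer the structure, then check samples against it" idea for a document store follows. The sample records and the type-based rules are hypothetical; a real profiler would also capture value distributions and present them visually:

```python
from collections import Counter

# Hypothetical sample pulled from a document store.
docs = [
    {"id": 1, "email": "a@example.com", "age": 34},
    {"id": 2, "email": "b@example.com", "age": "thirty"},   # type anomaly
    {"id": 3, "email": "c@example.com"},                     # missing field
]

# Step 1: infer the implicit structure -- which fields appear, with what types.
field_types = {}
for doc in docs:
    for field, value in doc.items():
        field_types.setdefault(field, Counter())[type(value).__name__] += 1

inferred = {f: types.most_common(1)[0][0] for f, types in field_types.items()}
print("Inferred structure:", inferred)

# Step 2: check each sample against the inferred structure and record exceptions.
exceptions = []
for doc in docs:
    for field, expected in inferred.items():
        if field not in doc:
            exceptions.append((doc["id"], field, "missing"))
        elif type(doc[field]).__name__ != expected:
            exceptions.append((doc["id"], field, f"expected {expected}, got {type(doc[field]).__name__}"))

print("Exceptions:", exceptions)  # these findings would be recorded in a data catalog
```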


Google Expands ML Kit, Adds Smart Reply and Language Identification

In a recent Android blog post, Google announced the release of two new Natural Language Processing (NLP) APIs for ML Kit, the mobile SDK that brings Google machine learning capabilities to iOS and Android devices: Language Identification and Smart Reply. In both cases, Google is providing domain-independent APIs that help developers analyze and generate text, speech, and other types of natural language content. Both of these APIs are available in the latest version of the ML Kit SDK on iOS (9.0 and higher) and Android (4.1 and higher). ... Smart Reply allows contextually aware message response suggestions to be returned within a chat-based application, enabling quick and accurate responses in a chat session. Gmail users have been using the Smart Reply feature for a couple of years now on the mobile and desktop versions of the service; now developers can include Smart Reply capabilities within their own applications.


Can Blockchain Replace EDI In The Supply Chain?
“Blockchain in B2B integration brings more agility. Today, B2B integration requires that both parties know each other, at least on a technical level, to provide ways to solve issues such as nonrepudiation and acknowledgement,” writes Forrester Research principal analyst Henry Peyret in “The Future of B2B Integration.” “Forrester expects that, in the next three to five years, blockchain technologies could be used to provide additional agility in building dynamic ecosystems.” Although EDI has built a 20-year track record of reliability, the venerable technology’s main weak point is its cost. “If there’s going to be a rationale for replacement, it might just be that blockchain is cheaper,” Fearnley says. But not everyone says the transition from EDI to blockchain is a done deal. “There have been many contenders to overthrow EDI over the years, and none of them have succeeded because EDI does what it does pretty well,” says Simon Ellis, program vice president of supply chain strategies at IDC. He adds, however, “If you can make things more secure and faster, everyone will benefit.”




Red Hat takes over maintenance of OpenJDK 8 and OpenJDK 11
Despite that, Oracle stopped providing security updates to Java 8 in January 2019, in an attempt to force organizations into paid licensing agreements. Naturally, running out-of-date, insecure versions of Java is an exceptionally bad idea, presenting a conundrum to IT managers responsible for the deployment of Java applications: either pay to maintain support for something that was once used for free, or—if even possible—attempt to move an application off Java entirely. There is a viable third option, however: using a non-Oracle distribution of Java. Because Java is still fundamentally open source, any organization that wishes to ship its own patched version of OpenJDK can do so. Red Hat—which contributes to Java upstream and ships a number of its own products built on Java—is doing just that. Red Hat is taking up the mantle of OpenJDK maintainer for versions 8 and 11, which will be supported until June 2023 and October 2024, respectively. New features are not expected for either version, as both are essentially in maintenance mode.




Data center workers happy with their jobs -- despite the heavy demands
Overall satisfaction is pretty good, with 72% of respondents generally agreeing with the statement “I love my current job,” while a third strongly agreed. And 75% agreed with the statement, “If my child, niece or nephew asked, I’d recommend getting into IT.” And there is a feeling of significance among data center workers, with 88% saying they feel they are very important to the success of their employer. That’s despite some challenges, not the least of which is a skills and certification shortage. Survey respondents cite a lack of skills as the biggest area of concern. Only 56% felt they had the training necessary to do their job, and 74% said they had been in the IT industry for more than a decade. The industry offers certification programs (every major IT hardware provider has them), but 61% said they have not completed or renewed certifications in the past 12 months. There are several reasons why: a third (34%) said it was due to a lack of a training budget at their organization, while 24% cited a lack of time, 16% said management doesn’t see a need for training, and 16% cited no training plans within their workplace.




Closing the cyber security gender gap reflects the realities of the larger global cyber environment, where there is diversity in gender, politics, society, economics, and culture. The bad guys are not only diverse in their thinking and actions; so are potential foreign security partners. As such, different perspectives and experiences are a necessary complement in an industry that often hits an obstacle when it comes to language and terminology. More importantly, greater inclusion in cyber security starts to tear down the antiquated perception that the profession is geared toward males. This is almost ironic considering that women have played prominent roles in computing, including programming, designing the computer systems that ran the U.S. census, and writing the software that supported the Apollo 11 mission. Addressing the cultural perception of the cyber security industry is necessary in order to level the employment playing field. Part of this requires a review to ensure that compensation levels are equal; according to a 2017 global information security study, women earned less than their male counterparts at every level.





Quote for the day:


"Surprise yourself every day with your own courage." -- Denholm Elliott


May 03, 2014

Intel searches for the value in open data
Intel is one of several large tech companies seeking economic value in open data. A research network called the Governance Lab, or GovLab, at New York University recently began publishing OpenData500, a list of companies using government data to generate new business, including Amazon Web Services, Garmin, IBM and Yelp. In exploring open data, Intel’s hypothesis is that “any kind of silo-ed, isolated data set is ... really limited in its ability to discover insights you didn’t know you were looking for,” said Brandon Barnett, director of business innovation.


Cathy O'Neil talks about trust in data analysis
I guess if I had to pinpoint my single most massive peeve, which really cannot be termed "pet," it would have to be hiding perverse incentives (and almost all incentives are perverse in some way) behind what people present as "objective truth". In my experience, outside of the world of sports where everything is transparent (except steroid use), there is always some opacity and gaming going on, and someone's either making money off of it, gaining status from its publication, or wielding power through it. And come to think of it, you've asked me the wrong question altogether. My biggest peeve with data interpretations is how many aren't published at all.


The promise of information
What sets information design apart from other design disciplines, aside from a commitment to what Two Twelve’s David Gibson listed as ‘hierarchy, logic, clarity, context’, is a belief in a kind of metadesigner, an ‘architect’ if you will, who will coordinate and transform information on behalf of the user. The ‘transformer’ was one subject of the first information design conference, so it was nice to hear Sue Walker from the University of Reading looking at how the Isotype folk, who coined the term, developed children’s information books in the 1940s and 50s.


Why the operating system still (kind of) matters
“If you look at the single-node Linux story, there is only one story, which is Red Hat,” Shuttleworth acknowledged. “What is more interesting, though, is if you look at Linux at large, you realize that single-node enterprise Linux story is a decreasing share of Linux in total. There are now vastly more Ubuntu servers running for enterprises than Red Hat servers running for enterprises. If you just look at what people are running on the web, for example, you see that very clearly.” So, he argues, as more companies start looking to build private clouds, they’ll want to keep those applications running on Ubuntu because its truer open source license structure is better suited to the idea of an elastic environment.


The Surprising Secret to Employee Engagement
Too often, Mark says, leaders fail to provide appreciation frequently enough. We often get so caught up in the push for continuous achievement that we forget to take the time to recognize what people have already achieved. Mark recommends that we actually schedule time for recognition each week. If it's on our calendars, we're much more likely to actually take the time to recognize what people have done well. He also recommends that we don't just recognize the top two or three performers, as this can create a culture where most people don't feel appreciated.


Why Facial Recognition Isn't the Way of the Future...Yet
Jay Hauhn, CTO and VP of Industry Relations for Tyco Integrated Security, breaks down the use of facial recognition into two categories: cooperative environments and non-cooperative environments. In the former, the person whose face is going to be scanned is aware of it and is opting into a process where it's serving as their credential; they're going to look straight into a camera with no attempt to obscure their face. Non-cooperative environments, however, are when the subject is not necessarily aware that their face is being scanned and is making no attempt to look directly at the camera. "In cooperative environments, it works fairly well," says Hauhn.


Microsoft Readies a Virtual Assistant for the Corporate World
“It knows everything I’m doing—what I’m reading, what I’m liking, who I’m following, the people I’m interacting with, who I’m responding to fastest—and serves up a personalized experience about what content is most interesting, what things I should be involved in, what people I should interact with,” says Julia White, general manager of the Microsoft Office suite. “My work is no longer about who sent me e-mail most recently; it’s about what’s most important to me.”  Oslo is the first app built on a platform known as the Office Graph, a database developed by the former employees of Fast Search & Transfer in Oslo, Norway, which Microsoft acquired in 2009.


Infor and 'No Fugly Software': Design as a competitive weapon
For Infor, design is therefore a euphemism for the broad collaboration associated with distilling processes and information down to what the user really needs, presenting that information in the most compelling and useful manner, and making it all look and feel good. Empathy for the user is central to this process. Although other large software vendors, like SAP, have embraced this kind of design thinking, the extent to which Infor is retooling both products and corporate culture around design appears unrivaled among companies of its size. As I noted on Twitter, Infor is actively trying to incorporate design as a core strategic theme into its cultural DNA.


Parsing EDI to XML (and vice versa)
Most of the articles related to EDI revolved around business controversies and comparisons between the different formats and dialects. Completely irrelevant to my research. I still don't understand why so many EDI formats co-exist nowadays (more than 5,000). It appeared to me that EDI was veiled in mystery, and the lack of information and cooperation was not something to be considered a simple act of randomness... I will leap over the entertaining side of EDI, like the conspiracy behind the multiple formats, the rebellious movement against VANs, and the ever-ongoing discussion on whether XML will eventually bury EDI (with UBL being the latest contender). My goal here is to share my knowledge of the basics of parsing an EDI message, in the hope that someone else may find it useful.
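To make those parsing basics concrete, here is a minimal sketch (not the article's own parser) that splits a toy X12-style interchange into segments and elements and emits simple XML; real interchanges need the full ISA envelope, sub-element separators, and repetition handling:

```python
from xml.etree.ElementTree import Element, SubElement, tostring

# Toy X12-style fragment; real messages carry a full fixed-width ISA envelope.
edi = "ISA*00*ZZ*SENDER*ZZ*RECEIVER~GS*PO*SENDER*RECEIVER~ST*850*0001~BEG*00*SA*PO123~SE*4*0001~"

SEGMENT_TERMINATOR = "~"   # in real X12 these delimiters are read from the ISA segment
ELEMENT_SEPARATOR = "*"

root = Element("Interchange")
for raw in filter(None, edi.split(SEGMENT_TERMINATOR)):
    elements = raw.split(ELEMENT_SEPARATOR)
    seg = SubElement(root, elements[0])           # segment ID becomes the element name
    for pos, value in enumerate(elements[1:], 1):
        SubElement(seg, f"{elements[0]}{pos:02d}").text = value  # e.g. BEG01, BEG02 ...

print(tostring(root, encoding="unicode"))
```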


Why Is RAID Dying a Slow Death?
First and foremost, one of the more common RAID levels -- RAID 5 -- began to show serious weakness as disk sizes continued to grow ever larger. Today, there are disks on the market that are a whopping 5 TB in size, which is massive by the standards of the era in which RAID was born. Back then, RAID adapters could rebuild the comparatively small disks of the day fairly quickly: when a disk in an array failed, it didn't take long to rebuild it. However, as disk capacity continued to increase, so did the amount of time it took to rebuild a failed disk. The problem: during a rebuild, there is additional stress on the whole array as bits are gathered to reconstruct the lost disk. As such, the potential for a double-disk fault increases.
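A rough sketch of the arithmetic behind that concern, using illustrative spec-sheet numbers (the rebuild rate and unrecoverable-read-error rate below are assumptions, not measurements):

```python
# Illustrative arithmetic behind the "bigger disks, riskier rebuilds" point.
DISK_TB = 5                      # disk size cited in the article
REBUILD_MB_PER_S = 100           # assumed sustained rebuild rate
URE_RATE = 1e14                  # typical consumer spec: one unrecoverable read error per 1e14 bits

rebuild_hours = DISK_TB * 1e12 / (REBUILD_MB_PER_S * 1e6) / 3600
print(f"Rebuild time for one {DISK_TB} TB disk: ~{rebuild_hours:.0f} hours")

# During a RAID 5 rebuild every surviving disk must be read in full.
surviving_disks = 5
bits_to_read = surviving_disks * DISK_TB * 1e12 * 8
p_read_error = 1 - (1 - 1 / URE_RATE) ** bits_to_read
print(f"Chance of hitting at least one read error during the rebuild: ~{p_read_error:.0%}")
```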



Quote for the day:

“A person who cannot handle setbacks will never handle victories either.” -- Orrin Woodward

September 29, 2013

Steve Jobs Left a Legacy on Personalized Medicine
It turns out that Jobs was one of the first people—and certainly the best-known—to try this kind of all-in genetic strategy to beat cancer. As recounted in Walter Isaacson’s biography of the Apple CEO, Jobs spent $100,000 to learn the DNA sequence of his genome and that of the tumors killing him. Jobs was jumping between treatments and hoped DNA would provide clues about where to turn next.


Strategies for Information Governance
Information Governance is a combination of business practices, technology and human capital for meeting the compliance, legal, regulatory, and security requirements of an entity, as well as its organizational goals. Information governance provides a means to protect, access, and otherwise manage data and transform it into useful information.


The First Carbon Nanotube Computer
For the first time, researchers have built a computer whose central processor is based entirely on carbon nanotubes, a form of carbon with remarkable material and electronic properties. The computer is slow and simple, but its creators, a group of Stanford University engineers, say it shows that carbon nanotube electronics are a viable potential replacement for silicon when it reaches its limits in ever-smaller electronic circuits.


Privacy and Security by Design: An Enterprise Architecture Approach
The new paper explores the strong synergy that exists between the related disciplines of privacy and security. Privacy seeks to respect and protect personally identifiable information by empowering individuals to maintain control over its collection, use and disclosure. Information security seeks to enable and protect activities and the assets of both people and enterprises. While on the one hand, strong security is essential to meet the objectives of privacy, on the other hand, well-known privacy principles are valuable in guiding the implementation of security systems.


Data Integration: From Dark Art to Enterprise Architecture
There’s a renewed push across the industry to elevate data integration from being a series of one-off projects shrouded in mystery to the core of a multidisciplinary, enterprise architecture—incorporating new and existing approaches such as master data management, data virtualization, and data integration automation. By introducing enterprise architectural sensibilities to data integration, it can be turned into a process for innovation and delivery of productive change for organizations.


Layered Application Design Pattern
In critical application development, the most important thing is the project architecture or project framework, which comprises the various modules involved in request processing or business handling. If the framework is clear, robust, and completely designed with proper planning, then it is much easier to write the code and develop a particular business requirement. For any application, if a developer knows the project framework/architecture, then half the work is already done; after that, the developer needs to focus only on functionality and need not worry about the interconnected business components involved in the project.
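As a minimal sketch of that kind of layering, with hypothetical class names, each layer depends only on the layer beneath it:

```python
# Minimal layered-architecture sketch: presentation -> business -> data access.
# Names are illustrative; the point is that each layer depends only on the layer below.

class OrderRepository:                      # data access layer
    def __init__(self):
        self._orders = {1: {"id": 1, "total": 250.0}}

    def find(self, order_id):
        return self._orders.get(order_id)


class OrderService:                         # business layer: rules live here
    def __init__(self, repository: OrderRepository):
        self._repository = repository

    def order_summary(self, order_id):
        order = self._repository.find(order_id)
        if order is None:
            raise ValueError(f"Unknown order {order_id}")
        order["discounted_total"] = order["total"] * 0.9   # sample business rule
        return order


class OrderController:                      # presentation layer: request handling only
    def __init__(self, service: OrderService):
        self._service = service

    def get(self, order_id):
        return {"status": 200, "body": self._service.order_summary(order_id)}


print(OrderController(OrderService(OrderRepository())).get(1))
```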


10 Things to Know Before Moving E-Discovery to the Cloud
Cloud computing is attractive because it enables users to do more with less; however, with this great power comes great responsibility and risk. This risk comes in the form of legal, security, business continuity and compliance issues. When e-discovery is involved, the risk considerations become increasingly complex. If your organization is considering moving data to the cloud, or transitioning e-discovery systems to the cloud, follow this checklist of the top 10 things to consider before taking the leap.


Tokenization for De-Identifying APIs
Once the PII has been identified, the data can be de-identified using tokenization or encryption (including format-preserving encryption). Or the data can be anonymized completely via redaction. This policy can be generalized to proxy several APIs and replace any PII that passes through. This works particularly well for credit card or social security numbers, both of which follow a very well-defined and relatively unique pattern.
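As a sketch of that pattern-based approach (the regexes and in-memory token store are illustrative only; a production system would use a vault-backed token service or format-preserving encryption):

```python
import re
import secrets

# Illustrative patterns for the two examples in the article: SSNs and credit card numbers.
PII_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "card": re.compile(r"\b(?:\d[ -]?){15}\d\b"),
}

token_store = {}  # token -> original value; stands in for a secure token vault

def tokenize(match: re.Match) -> str:
    token = "tok_" + secrets.token_hex(8)
    token_store[token] = match.group(0)
    return token

def de_identify(payload: str) -> str:
    """Replace any PII matching the known patterns before the payload leaves the proxy."""
    for pattern in PII_PATTERNS.values():
        payload = pattern.sub(tokenize, payload)
    return payload

print(de_identify("SSN 123-45-6789 paid with card 4111 1111 1111 1111"))
```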


Developing a Legal Risk Model for Big Volumes of Unstructured Data
Unfortunately, mass-deleting information carries cost as well, including the loss of data that might be important to the business, and the risk of deleting information required for regulatory compliance. But perhaps the biggest cost of unidentified data comes when a company must pay to categorize data as part of a legal electronic discovery process. Opposing attorneys can—and will—force companies to produce every piece of data that might be relevant to the case.


An Enterprise Security Program and Architecture to Support Business Drivers
An understanding of Seccuris’ approach will illustrate the importance of aligning security activities with high-level business objectives while creating increased awareness of the duality of risk. The business-driven approach to enterprise security architecture can help organizations change the perception of IT security, positioning it as a tool to enable and assure business success, rather than an obstacle to be avoided.



Quote for the day:

“To be yourself in a world that is constantly trying to make you something else is the greatest accomplishment.” -- Emerson