How to Take Out the Trash: Weeding Out Bad Data & Keeping It Out

A good way to prevent data contamination is to set up a categorical, tiered system that tracks data through the complete process toward an eventual end goal, with constant, consistent monitoring to make sure the data stays on track and with outcomes set to be measured at regular intervals.

The tracking itself is further divided into three groupings: macro, micro, and sub-micro levels. If diagrammed, the system resembles a tree: the macro level makes up the “trunk,” acting as the main portion to which all the “branches” (micro levels) are connected, with even smaller “twigs” (sub-micro levels) extending out from the branches. Also, much like a tree, the system may not only grow but also flourish and become beautiful, similar in look to a Mandelbrot set, if one can wax poetic about it.

Tracking data at the “trunk” macro level requires keeping the level tight and focused on a single subject. Adding too much information and too many outcomes at the macro level will make the micro and sub-micro levels virtually useless for analysis and could spread the information too thin, creating an infinitely complicated periphery that bears ever more detailed, ouroboros-like aspects. An example of an all-embracing macro level “trunk” is a social media campaign. Keeping the macro level tapered allows for an easy gateway to interpret important baseline information and see if anything is amiss.

Beyond the macro level lies the “branches” micro level, which consists of segmented data from the macro level. Continuing with the example of a social media campaign as the macro level, it may be divided into micro levels based on the website or app used: Facebook, WhatsApp, Twitter, Instagram, Google+, YouTube, Snapchat, etc. The micro levels of a macro level may be whatever you wish them to be, provided they effectively organize the more specific data and lend themselves to easy access from the macro level when investigating anomalies (or allow in-depth investigation of anomalies at the macro level).

Beneath the micro levels lie the “twig” sub-micro levels, which are more specific still; the sub-micro levels of a social media platform's micro level could be distinct, identifiable individual campaigns or pieces of media. Dividing the data even further allows for even easier access and more streamlined approaches, giving the marketing and sales team an equal footing with the data analytics side and allowing both to combat any enigmas in the data. Using this type of data integrity system to access data sets easily lets the user prevent or combat “garbage” leaking in, and thereby mitigates it spilling out where it can cause damage, saving users many headaches down the line.
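
As a rough illustration of the tiered structure described above (the node names, metrics, and roll-up logic here are hypothetical, not a prescribed schema), the trunk-branch-twig hierarchy can be modeled as a simple tree whose metrics are summed upward, so an anomaly seen at the macro level can be traced down to the twig that caused it:

    from dataclasses import dataclass, field

    @dataclass
    class Node:
        """One tier in the tracking tree: a macro 'trunk', micro 'branch', or sub-micro 'twig'."""
        name: str
        metrics: dict = field(default_factory=dict)   # e.g. {"clicks": 3400}
        children: list = field(default_factory=list)  # the next tier down

        def add(self, child: "Node") -> "Node":
            self.children.append(child)
            return child

        def rollup(self, metric: str) -> float:
            # Sum a metric across this node and everything beneath it, so a number
            # that looks wrong at the trunk can be chased down the branches.
            return self.metrics.get(metric, 0) + sum(c.rollup(metric) for c in self.children)

    # Hypothetical example: a social media campaign as the macro "trunk"
    campaign = Node("Social media campaign")
    facebook = campaign.add(Node("Facebook"))                    # micro "branch"
    facebook.add(Node("Spring promo video", {"clicks": 3400}))   # sub-micro "twig"
    facebook.add(Node("Carousel ad", {"clicks": 900}))
    print(campaign.rollup("clicks"))  # 4300, one baseline number to sanity-check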

Analyzing the Audience

Another helpful dissection is segmentation of the audience to better analyze response data from the client side. For example, an age-range macro level could be divided into micro levels based on specific numerical ranges (18-24 males, 25-34 females, etc.) or psychographic segmentations such as shared personality traits, consumer beliefs and lifestyles. Such data is valuable for establishing separate reports on distinct target audiences, allowing the user to dive into them to brainstorm or to work on a problem, solution or opportunity while knowing where all the information is and where to look to find it, much like the tiered levels of the corporate-side system.
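
As a lightweight sketch of that audience segmentation (the field names, age brackets, and records below are made up for illustration), response records can be bucketed into micro-level segments and reported on separately:

    from collections import defaultdict

    # Hypothetical response records collected from the client side
    responses = [
        {"age": 22, "gender": "M", "clicked": True},
        {"age": 29, "gender": "F", "clicked": False},
        {"age": 31, "gender": "F", "clicked": True},
    ]

    def segment(record):
        """Map one record to a micro-level bucket within the 'age range' macro level."""
        if 18 <= record["age"] <= 24:
            return f"18-24 {record['gender']}"
        if 25 <= record["age"] <= 34:
            return f"25-34 {record['gender']}"
        return "other"

    report = defaultdict(lambda: {"responses": 0, "clicks": 0})
    for r in responses:
        bucket = report[segment(r)]
        bucket["responses"] += 1
        bucket["clicks"] += int(r["clicked"])

    for name, stats in sorted(report.items()):
        print(name, stats)  # one separate report per target audience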

Note that while this system, in both instances, should allow more efficient data access and detection of anomalies, it is still not perfect and may still require combing through a varying amount of information if incongruities or unpredictable events appear in the harvested data. Again, the most foolproof way to keep bad data from seeping into results is maintaining the initial commitment to data integrity and entry. Honesty and scrupulousness are still the best policy, after all, and so are focus and care.

One last thing: be wary of accidentally overanalyzing the data in the process (“paralysis by analysis”). Even if you think you may be able to get to the root of the problem if you look long enough, you may get overwhelmed and lose yourself trying to process all the information at once. If you feel like you’re becoming swamped with facts and figures, remember to take a step back, breathe and relax.

You’ve now got a manageable system to work with at your fingertips.


Garbage In, Garbage Out: Why Bad Data is Worse Than No Data

Since the 1980s and ’90s, computers have grown in importance not just in a personal sense but in a business one as well. While technology has made life easier, it is still powered by humans (for now!) and therefore not infallible. You simply can’t trust your insights when you can’t trust the inputs.

How does this concept relate to the education industry? Mainly through hardware, sales software and analytical marketing tools: while the leap from sales binders to Excel spreadsheets may have made enrollment and sales data more streamlined and convenient, the results ultimately depend on the data entered rather than the vehicle.

With human error occurring more often than we want to admit, false or faulty data can still leak into a document or calculation and contaminate outcomes, resulting in misaligned marketing strategies, increased costs and business instability. The problem is amplified when large and varied sets of big data need to be analyzed to help an organization make informed business decisions. This is often a complex process of examining large and varied data sets to uncover information, including mystifying arrays, undiscovered parallels, market developmental cycles and buyer biases, that helps organizations gain valuable insights, enhance decisions and create new products. The relationship between bad input and bad output can be summarized by one phrase: garbage in, garbage out.

The evolution from Rolodex to spreadsheet or even smartphone app has certainly streamlined collecting information, but it hasn’t entirely eliminated user error. Innovations in hardware and software have made it uncomplicated and cost-effective to amass, stockpile and evaluate copious amounts of sales and marketing data. If good information is put in, good data will come back out, and vice versa, which can significantly affect planning, buying and selling decisions. In education marketing, user error makes it more difficult to know the client. In essence, bad data is as good as no data, and perhaps even worse.

So, what can we do? While adherence to data integrity and entry, along with correct set-up, ensures the best and most accurate results, human error will always be with us. Bad data input will always occur, but controlling for bad data and engineering procedures to supervise data integrity will help eliminate issues in decision making and avoid increased costs and organizational miscues. The best solution is to detect the ‘bad’ early and locate the problem before it gets worse. Fortunately, we can do something about data quality; no one wants to find out a pipe is clogged only after the basement has flooded. Admitting that you have a data quality problem is the key to the solution.
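
As a minimal sketch of what detecting the ‘bad’ early can look like in practice (the field names and validation rules here are hypothetical, not a recommended standard), a few entry-time checks can quarantine suspect records before they contaminate downstream reports:

    from datetime import date

    def validate(record: dict) -> list[str]:
        """Return a list of problems with one lead record; an empty list means it looks clean."""
        problems = []
        if "@" not in record.get("email", ""):
            problems.append("missing or malformed email")
        if not isinstance(record.get("age"), int) or not 0 < record["age"] < 120:
            problems.append("implausible age")
        if record.get("enrollment_date", "") > date.today().isoformat():
            problems.append("enrollment date in the future")
        return problems

    # Quarantine suspect rows instead of letting "garbage" flow into the analysis
    raw = [
        {"email": "student@example.com", "age": 19, "enrollment_date": "2024-09-01"},
        {"email": "not-an-email", "age": 250, "enrollment_date": "2024-09-01"},
    ]
    clean = [r for r in raw if not validate(r)]
    quarantined = [(r, validate(r)) for r in raw if validate(r)]
    print(len(clean), "clean records,", len(quarantined), "quarantined")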

Tune in to my next article to find out how segmenting data by audience, implementing a system of controls and a tiered tracking system, and maintaining management oversight can help keep data on track. I’ll also provide an important warning about overanalyzing data that can save you great turmoil and stress.


The US Desperately Needs a Department of Cyber Security

As the world is turning digital, warfare is following suit in a very rapid and devastating way. Countless organizations in all sectors (Target, Equifax, DNC, IRS) are continuously reporting data hacks. According to the Government Accountability Office (GAO), federal civilian agencies reported 35,277 cybersecurity incidents, such as web-based attacks, phishing and loss or theft of computing equipment in 2017.

The public and private sectors in the US have not adapted to cyber threats. Instead of presenting a unified front for defending against these attacks and having a plan to go on the offensive when necessary, most organizations are busy doing damage control by themselves, without any real long-term plan. This happens despite the fact that countless studies show, year after year, that cybersecurity is the number one priority for IT leaders.

A recent survey of government organizations, private sector and citizens in the U.S., China, Russia, and India found that more than 88% of participants believe that cyberspace threats are significant.

In the United States alone, state and local government IT leaders have maintained for years that cybersecurity needs to be the government’s priority. A 2018 Digital Cities Survey of city government IT leaders put cybersecurity as the top priority. The same survey of county government IT leaders placed cybersecurity at the top of the list for the past 5 years in a row. Lastly, the National Association of State Chief Information Officers (NASCIO) published their top 10 policy and technology priorities for 2019, and cybersecurity was named number 1.

The conventional literature throughout our country claims that cybersecurity is everyone’s problem, and that it needs to be dealt with on multiple levels within the government, private sector, as well as individual citizens. While it is true that cybersecurity needs to be fought for on multiple levels, this fight is extremely inefficient when everyone does their own thing, without a leading organization to set the policy and bear full ownership of outcomes.

The reality is that our nation’s current organization for dealing with cyber-attacks is doomed to fail. Responsibilities, skills and talent are spread across too many different parts of the government, which creates confusion, and most importantly, a lack of leadership and ownership.

For example, the Department of Defense, through its US Cyber Command arm, is responsible for national defense. The FBI is responsible for investigation and enforcement. The Department of Homeland Security oversees damage control and recovery after cyber-attacks. Lastly, every military branch has its own individual cyber units. Lack of communication and too much bureaucracy make our cybersecurity efforts extremely inefficient, putting our nation at risk with each second that passes. Each of these organizations has many other responsibilities and is stretched too thin to give cybersecurity the focus and resources it desperately needs.

President Trump is trying to rectify this situation by further centralizing the management and oversight of federal civilian cybersecurity through the National Cybersecurity Strategy of September 2018. This strategy will enable the Department of Homeland Security to secure all federal department and agency networks, with the exception of national security systems, the Department of Defense and the Intelligence Community. This is a step in the right direction, but it needs to be taken further.

There needs to be a department that is one hundred percent responsible for our nation’s cyber security, in the same way our military is responsible for our physical security. This department could be called the “Department of Cyber Security” (DCS) and it should set the policy, provide the proper organizational structure, and work with all other parties (government, private sector, and citizens) to gain control of our nation’s cyber security.

The new Department of Cyber Security’s top priorities should be to:

I. Request and maintain adequate funding – this is a top national security priority.

II. Mobilize our country’s best talent and resources to operate under a single umbrella and a single coherent policy.

III. Fill in the talent gap by promoting cybersecurity workforce development, training and economic development. According to the “Presidential Executive Order on Strengthening the Cybersecurity of Federal Networks and Critical Infrastructure,” there is an estimated shortfall of 299,000 cybersecurity professionals across all industry sectors.

IV. Incentivize research, contests and hackathons – the department must adopt and encourage unconventional methods of cyber warfare.

V. Collaborate with the private sector to share threat intelligence on an ongoing basis, as well as new advances in the digital world.

VI. Outline liabilities, reporting requirements, and course of action for the other organizations to follow.

The United States must treat the issue of cybersecurity with the same seriousness it treats the military. The effort must be organized from the top down and be prepared to defend our networks and to attack at a moment’s notice. Not prioritizing cybersecurity policy leaves federal, state and local agencies, U.S. critical infrastructure, businesses and citizens extremely vulnerable to attacks that could be absolutely devastating.

Creating a new Department of Cyber Security that is one hundred percent in charge of and responsible for our nation’s cybersecurity is the only solution that allows our country to gain control of cyberspace, successfully defend our networks and be ready to go on the offense when necessary.


US Health Care and Health Tech Innovation

On March 28, 2019, District of Columbia Judge John Bates ruled against the US Labor Department’s association health plan (AHP) rule. The AHP rule allowed small businesses to pool together affordably and provide health plans for employees in a competitive landscape. Judge Bates, however, saw the rule as an initiative to avoid Obamacare regulations. Once again, competition and choice in healthcare have been denied to US small businesses, the backbone of national employment. The question must now be asked: if we cannot come to a workable decision on providing transparent health insurance options to the American people, how will we move forward on improving healthcare? The cost of US healthcare is predicted to reach 20% of GDP by 2025. We need to do better.

According to the Centers for Medicare & Medicaid Services, healthcare expenditures have skyrocketed from $28 billion to $2.6 trillion over the past 50 years. US taxpayers are bearing the brunt of this increase, with no foreseeable solution coming forth to mitigate the burden. The remedy here is not to block competitiveness in health insurance services, but to increase health technology in our nation’s hospitals, health centers and clinics. We need to become proactive: that is to say, not throw valuable insurance money after reactive, outdated medical facilities and treatments, but invest in the latest preventative, diagnostic and reporting technologies to foster transparency and innovation. As stated in HP’s Megatrends, we need to “shift from standardized, reactive and centralized care to personalized, preventative, decentralized…care for all US citizens” through health technology.

First, let us clarify what health technology entails. Health technology refers to all advancements in procedures which improve both the quality and cost of providing healthcare to individuals and communities. The National Information Center on Health Services Research and Healthcare Technology (NICHSR) lists the following highlights that create a health technology demand:

  • Increasing prevalence of chronic diseases
  • Advances in science and engineering
  • Aging populations (baby boomers)
  • Third-party payment, especially fee-for-service payment
  • Financial incentives of technology companies, clinicians, hospitals, and others
  • Off-label use of drugs, biologics, and devices
  • Strong, growing economies

The United States fits all of the above requirements, and then some. The Trump administration has been on the front line of advancing health tech, starting with electronic health records (EHR). The Centers for Medicare and Medicaid Services (CMS) announced the MyHealthEData initiative in 2018. This program is supported by the White House Office of American Innovation, as well as the National Institutes of Health and Veterans Affairs, among others. MyHealthEData will give patients electronic access to all of their health records and allow them to choose providers based on cost and accuracy transparency. Patients will be able to share their data with whichever provider they choose. This is a revolutionary concept in healthcare! According to Jared Kushner, President Trump’s advisor, the Administration is working diligently to solve the interoperability of health data within the nation’s healthcare institutions. Prior administrations have spent over $36 billion with no clear results in fully digitizing or maintaining the accuracy of health records. Lamar Alexander (R-Tenn.), Chairman of the Health, Education, Labor and Pensions Committee, supports the Administration’s interagency EHR plan, stating that it will impact over 125 million US patients. We are comforted to see such federal initiatives redirected to solve transparency and cost issues within healthcare.

The US healthcare industry is geared to be the most impacted by Industry 4.0. As delineated in HP’s Megatrends, digital technologies such as 3D printing and emerging technologies such as augmented-reality haptic holography, microfluidics and autonomous robotic caretakers are now a reality, and they will benefit provider and patient alike once mainstream.

  • 3D printing has been introduced to many large private US hospitals and is making strides in lowering the cost of customized diagnostic devices and patient implants. 3D printing can even mass-produce hospital buildings as well as the customized instruments, tools and medication needed for patients.
  • Haptic hologram technology is no longer science fiction. This new augmented reality software converts 2D medical imaging such as MRI scans into virtual reality images. Why is this important? The technology allows the surgeon to do a holographic pre-run of surgery on the actual patient without dissection. So, when the actual procedure begins, diagnostic accuracy and surgical precision are dramatically increased while operating time is reduced.
  • Microfluidics is described as “an entire lab on a tiny microchip.” The tiny microchips give a full diagnosis of the patient using a minute sample of patient fluid. The process is not invasive, has a lower test cost and is perfect for point-of-care (POC) tests for urban and rural populations.
  • Artificial intelligence in the geriatric health care sector is on an upward trend. AI makes hospices “smart,” even providing individualized robot caretakers that can immediately detect health changes in elderly patients.

These technologies are already in use and are being further developed by companies such as HP and IBM, with support from large private healthcare institutions via innovation labs. While many may initially believe these technologies to be expensive to implement, we recall that prior administrations have spent billions of dollars on health care ‘reform’ with little change in our healthcare crisis.

From a policy standpoint, we applaud the White House Office of American Innovation and the Department of Health and Human Services for moving forward with the Administration’s interagency plan to improve electronic health records interoperability, and we suggest working with agencies such as the National Information Center on Health Services Research and Healthcare Technology (NICHSR) on emerging technology assessments and innovation labs to make US healthcare technology the most innovative and accessible for US citizens. US healthcare insurance reform is currently in a legal bottleneck, to the detriment of the American people. It’s time to refocus time and financial energy on augmenting our actual healthcare institutions to provide the most beneficial, accurate and transparent healthcare through health tech innovation.


Credit Reporting Reform: Individual Consumers Must Take Responsibility of Their Own Data

In September 2017, Equifax announced that the information of 143 million Americans had been hacked. Equifax was just one of the latest companies to be compromised, joining Yahoo’s 1 billion hacked accounts, JPMorgan’s 83 million, and Target’s 40 million, among others.

What made this hack so concerning is the fact that Equifax is one of the largest consumer reporting agencies, collecting our very personal and actionable information into a centralized database without our knowledge or consent: our names, birthdates, Social Security numbers, addresses, personal finances, credit card numbers, student loans, insurance choices, rent payments and more. 143 million accounts (60% of all adults in the US) have been compromised. Our data, which we never offered or gave permission to collect and use, has been made available to malicious strangers. This is a very important topic.

The Fair Credit Reporting Act (FCRA), a law enacted in 1970, currently governs Equifax and the other credit reporting agencies. Since then, there haven’t been any major changes or updates, except in 2010, when Congress created the Consumer Financial Protection Bureau (CFPB) as the first federal agency with authority to examine and regulate consumer reporting agencies. While this was a much-needed addition, it does not provide the requirements necessary to keep our data safe.

Credit bureaus are treated much more loosely than banks, as they do not have the same regulatory oversight and do not have regular security audits. In the event of data breaches, such as Equifax’s, there is no specific federal entity designated to investigate the breach.

In response to this tragedy, Rep. Maxine Waters has introduced the Comprehensive Consumer Credit Reporting Reform Act of 2017, which is intended to be a complete overhaul of the country’s credit reporting system. Among other things, it plans to change the dispute process by shifting the burden of proving the accuracy of information from consumers to credit bureaus, restore the affected credit of victims of predatory activities and unfair practices, restrict the use of credit information for employment, rehabilitate the credit standing of struggling private education loan borrowers and limit the amount of time negative information can stay on a credit report.

The proposed changes of this act could positively impact consumers, but they do not specifically address the cybersecurity problem. This act does not provide a specific solution to preventing data breaches and protecting consumers’ information from hackers.

This is a new world defined by ubiquitous, overpowering cyberattacks that render all current cybersecurity systems inadequate. For the time being, unfortunately, it seems there is no hack-proof way of storing our data. So, if we cannot control who sees our data, we must at least be able to control and limit its use.

The best bet is to provide each individual person with their own ability to monitor and control access to their credit information. Regulators must require credit reporting agencies to provide free credit freezes to all people.

A credit freeze is a process that allows you to automatically block anyone from checking your credit, making it impossible for impersonators to open any line of credit under your name. If your credit has a freeze on it, you’ll be notified if someone even attempts to open a line of credit using your information. In the same way you have a 2-factor verification system for your email or cryptocurrency accounts, credit freezes can provide added security layers that consumers can monitor and control individually.

This way, you can keep your credit info in “dark mode” and only open access to your credit at the exact moment you are applying for a loan or doing any other activity that requires access to your credit score. As soon as you are approved or denied, you can freeze your credit again.

Currently, credit freezes cost $20 each time you initiate one. And because you most likely must initiate a credit freeze with each of the big three credit reporting agencies (Equifax, Experian, and TransUnion), the cost adds up to $60 per freeze. Moreover, there are hundreds of other, smaller credit reporting agencies, so the process can get rather complicated and tedious. New legislation needs to require that this credit freeze process be available, and preferably free (or much lower cost), for the consumer across all agencies.

This is a tremendous opportunity for the private sector to provide a much-needed solution: create a platform or application that connects with all credit agencies and offers consumers instant and painless options to take control over their data. Instead of logging on to multiple credit agencies’ websites each time they wish to freeze or unfreeze their credit profile, consumers should have a simple application that communicates with all credit agencies (or selected ones, depending on the consumer’s preference) and can freeze or unfreeze credit profiles with the simple push of a button.
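
As a rough sketch of what the core of such an application could look like (the bureau names are real, but this client interface, its methods, and the idea of a unified freeze endpoint are entirely hypothetical; no such standard API exists today), the concept is a single call that fans a freeze or unfreeze request out to every bureau the consumer selects:

    from dataclasses import dataclass

    AGENCIES = ["Equifax", "Experian", "TransUnion"]

    @dataclass
    class FreezeResult:
        agency: str
        frozen: bool
        confirmation: str

    class CreditFreezeClient:
        """Hypothetical client that would talk to each bureau's (assumed) freeze endpoint."""

        def __init__(self, consumer_id: str, agencies=AGENCIES):
            self.consumer_id = consumer_id
            self.agencies = list(agencies)

        def _request(self, agency: str, action: str) -> FreezeResult:
            # Placeholder for a real API call; no standard bureau API is assumed here.
            return FreezeResult(agency, frozen=(action == "freeze"),
                                confirmation=f"{agency}-{action}-{self.consumer_id}")

        def freeze_all(self) -> list:
            return [self._request(a, "freeze") for a in self.agencies]

        def unfreeze_all(self) -> list:
            return [self._request(a, "unfreeze") for a in self.agencies]

    # "Push of a button": freeze everywhere, unfreeze only while applying for credit
    client = CreditFreezeClient(consumer_id="demo-123")
    print(client.freeze_all())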

This collaboration between the government and the private sector must have the chief purpose of allowing individual consumers to control the use of their own credit profiles, in the hope of enhancing security. It is far more complicated, discouraging and fruitless for hackers to try to break into 143 million individual accounts than to break into one database holding 143 million accounts. As our banking and financial system changes to give consumers more freedom over their money, perhaps it is time for the credit reporting agencies to do the same.

Since the credit bureaus and regulatory organizations cannot protect our credit data, it is time to let the private market and individual consumers provide a smarter solution.
