California health care businesses to create new database

There are many reasons why a company would choose to create a new database. In many cases, it depends on what the organization is trying to accomplish, and that can be a substantial task. For example, according to a recent article from the Los Angeles Times, Anthem and Blue Shield of California are partnering to create a massive database of patient medical records.

The goal is to create a more comprehensive collection of information. Once the system is in place, doctors and nurses will be able to access the medical histories of nearly 25 percent of all California residents with a few keystrokes. The result should be faster, cheaper and better health care, achieved by reducing repetitive tests and procedures. Imagine an ER doctor who can pull up a patient's medical records instantly.

"We need to bring healthcare into the digital age, and by doing so you can really improve the quality and cost of care," Paul Markovich, chief executive of Blue Shield of California, told the news source.

However, there are challenges to getting the $80 million system online by the end of the year. The biggest is the lack of standardization in the data: each company has its own data types, definitions and fields, which adds to the complexity.
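To see why that lack of standardization matters, here is a minimal Python sketch of mapping two hypothetical insurers' differently named fields onto a single canonical patient schema. The source names, field mappings and records are invented for illustration and are not drawn from the Anthem or Blue Shield systems.

```python
# Minimal sketch: normalizing patient records from two hypothetical sources
# into one canonical schema. Field names and mappings are illustrative only.

CANONICAL_FIELDS = ["patient_id", "date_of_birth", "blood_type"]

# Each source labels the same facts differently.
FIELD_MAPS = {
    "insurer_a": {"member_no": "patient_id", "dob": "date_of_birth", "blood": "blood_type"},
    "insurer_b": {"PatientID": "patient_id", "BirthDate": "date_of_birth", "BloodGroup": "blood_type"},
}

def normalize(record: dict, source: str) -> dict:
    """Rename a source record's fields to the canonical schema."""
    mapping = FIELD_MAPS[source]
    out = {canonical: None for canonical in CANONICAL_FIELDS}
    for source_field, canonical in mapping.items():
        if source_field in record:
            out[canonical] = record[source_field]
    return out

print(normalize({"member_no": "A123", "dob": "1980-04-02", "blood": "O+"}, "insurer_a"))
print(normalize({"PatientID": "B456", "BirthDate": "1975-11-30"}, "insurer_b"))
```

Even in this toy form, the second record comes back with a gap where the blood type should be, which hints at how much reconciliation work a real integration of this scale involves.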

With the growing use of digital technology, the amount of data that is available continues to increase. To improve the analysis of this information, a custom database software solution can help businesses organize their data in a better way that speaks to their specific needs.

How databases can help athletes perform their best

When you think about athletics, databases are likely the furthest thing from your mind. However, analytics may soon be as big a part of sports as the swish of a basketball net or the arc of a perfectly thrown football pass. 

Sports are about performance, endurance and grace, but they are also in many ways about information. When competing at the highest levels, it's critical to be able to identify strengths and weaknesses, and think about them in a way that minimizes the latter and spotlights the former. With the sort of databases that are currently available, any athlete or team that does not seriously consider how structured analysis can improve their chances of winning is doing themselves a disservice. 

However, simply investing in data is not very helpful if the organization has no context for it or no ability to understand it. Drawing the correct conclusions from information can be an intricate process, so it's critical to have the proper support in place to ensure that the process is as worthwhile as possible. 

One of the epicenters for analytics in sports is MIT's Sloan Sports Analytics Conference, where interested parties the world over come to discuss the latest and best in the field. Writing for its website, Professor Benjamin Alamar highlighted the need for careful consideration with sports databases.

"The push in sports—as in business—to use analytic tools comes from advances in computing power and the availability of massive amounts of data to both teams and the public, which create an opportunity for competitive advantage. Having access to information that competitors do not has a long history of providing teams and businesses with advantage," Alamar explains.

By consulting with reputable FileMaker developers, any athlete or organization can improve performance. 

Connecticut law firms improve operations with technology

Many people who follow the technology landscape closely have made the case that technologies like mobile devices and the cloud are fundamentally changing the way entire sectors of business operate. The legal industry is no exception.

A recent article from the Hartford Business Journal profiled Connecticut's legal sector, which has rebounded in recent years following the recession. Helping with the turnaround has been the influence of business technology. Specifically, this has included the use of mobile devices, encrypted networks and cloud storage.

According to Mark Dubois, a lawyer from New London and the incoming president of the Connecticut Bar Association, the use of this technology has fundamentally changed how individuals define themselves as lawyers, how they communicate with clients and how legal services are delivered. This isn't doing the same task with a different tool, but a refashioning of everything.

"Technology has been an important sentinel in his firm's march toward greater efficiency and security for client-lawyer communiques and documents," Thomas Marrion, managing partner in regional law firm Hinckley Allen's Hartford office, told the news source. "We think three to four years down the road, we'll see even more the benefits of investing more time and effort in that.''

The only problem is that there is no "one-size-fits-all" solution for every business. Instead, companies need to look at their specific situations and either create custom database software or examine their networks to make sure every system is deployed successfully and meets the company's particular needs.

Databases could help give clues to who will win Oscar races

For movie buffs, the Academy Awards are often compared to the Super Bowl. Once a year, the best and most accomplished get together to determine who performed the best over the previous year, replete with weeks of lead-in coverage and millions of dollars spent on performances and ads. And just like with the Super Bowl, interested parties enjoy making their predictions on who will win. Traditionally, these parties have included film geeks, entertainment press and industry professionals, but now, there's a new group among those ranks: data scientists. 

One company, ICC, has been delving into the Oscar-guessing world via Farsite, its analytics arm. They recently posted their predictions, which are based on the sophisticated modeling used by their custom database software. Farsite also posted a brief description of how their algorithms work, which is of particular note considering that their predictions were nearly spot-on last year. 

The company was able to accurately guess the winners in five of the six major categories: Best Picture, Best Actor, Best Actress, Best Supporting Actor, Best Supporting Actress and Best Director. They stood out in particular by predicting that Christoph Waltz would walk away with the hardware for his supporting turn in the Quentin Tarantino slave epic Django Unchained. 

In an explanation on their site, Farsite detailed some of the variables that go into making their predictions. 

"We'll use a first-of-its kind data-modeling tool to predict Oscar winners again this year. The model incorporates more than 40 years of film industry and Academy Award related information to forecast probabilities for the winners. This information includes real-time data and an array of variables, including total nominations, other Guild nominations and wins, buzz and nominees' previous winning performances," reads the site.

This Academy Awards season, they expect 12 Years a Slave to walk away with Best Picture, Alfonso Cuaron to win Best Director, Matthew McConaughey to walk away with Best Actor, Cate Blanchett to take down the Best Actress award, Jared Leto to win Best Supporting Actor and Lupita Nyong'o to get the hardware for Best Supporting Actress. 

Will they be right? The answer will be revealed during Sunday's Oscar telecast. In any event, their predictive software demonstrates one of the many uses for programs like FileMaker, which are able to seamlessly capture and parse large amounts of data. 

What the White House needs to ask about Big Data

Big Data has had a transformative effect on the efficacy of business, so it's no wonder that the government would take note. That's exactly what's happening at the White House, which recently announced that it was forming a task force to study Big Data's effects on policy, privacy and society. Not surprisingly, getting a full handle on Big Data, and the custom web application development built around it, isn't easy. There is a lot of information floating around, and the industry moves quickly. To get the most out of this investigation, it will be important for the White House to focus on answering the right questions. 

While privacy and security are both pressing concerns, it's important that the White House is able to accurately evaluate the distinction. The former refers to how much data companies have access to and store, and the latter refers to how safe that information is kept from interlopers. In order to accurately create a policy to address both concerns, they first have to be considered separately. 

It's also worthwhile to figure out what exactly it means for data to be personal. With how much people are now willing and able to share about themselves, the notion of what is for public consumption has shifted. If a group wants to use Big Data ethically, there needs to be a firm policy in place establishing what is fair game. 

Finally, the government has to understand that Big Data isn't just about what's happening today. It's also about where the industry is going in the future, and creating a system that is able to identify and adapt to shifts is the key to continued success. 

Is small thinking hurting your big data strategy?

Big data has become so prominent in the corporate lexicon that some individuals may mistakenly think they have a firm handle on the concept, when in fact their understanding is far short of where it needs to be. Even worse, many business managers are thinking about big data the wrong way, which can significantly limit the effectiveness of their data management strategies. 

An article in GigaOM suggests these people are thinking small about big data, asking questions that don't allow them to capitalize on the value of their information. According to the piece, questions like "how do we store all of this data" and "what's a different way to analyze this" represent the kind of thinking that hinders an organization's big data progress.

"This is small thinking. And it's dangerous," the article says. "Focusing on the technology and new forms of data in isolated and abstract ways will ultimately limit the value. Instead, organizations should be searching for ways to incorporate big data and data science into their existing capabilities. Preexisting business intelligence activities still have value but can be enhanced by adding new big data capabilities."

The key is to find ways to incorporate big data into your current operations and use it to your advantage. This might require the development of systems designed to facilitate big data management. FileMaker developers who can conduct custom Web application development are able to deliver systems that collect, manage and store information, all in a virtual environment, allowing you to capitalize on your information growth.

‘Big Data Stack’ could be next big thing

As more and more companies start to use custom database software to conduct their everyday business, they are in turn gaining a greater understanding of how it works. This increased awareness is leading to expansions on the traditional analytical framework, which could ultimately push progress far beyond what we've come to understand as the limits of Big Data.

One advancement that appears imminent is the debut of the Big Data "stack". 

A slew of businesses, some 42 percent of organizations according to a CompTIA study, have already made use of the first level of analytical capability. These groups are now looking into what else they can do with the technology, developing interfaces that are more layered, with a wider range of options available at each interaction. 

The first layer is where the data resides, an arena that is quickly becoming more scalable. In the next layer, companies will be able to integrate data from other sources, prepping, cleaning and combining the information so that it is even more useful. After this comes the analytical layer, where conclusions can be drawn and action plans formulated. Finally, the predictive layer closes out the process, allowing companies to look forward and anticipate changes before they occur. 
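As a compressed illustration of those four layers, here is a minimal Python sketch in which each layer is a plain function handing its output to the next. The records, field names and the naive growth factor are invented; a production stack would rely on dedicated storage, integration, analytics and modeling tools at each level.

```python
# A toy version of the four-layer "Big Data stack" described above.

raw_store = [
    {"source": "web", "units": "5"},
    {"source": "pos", "units": None},   # an incomplete record to be cleaned out
    {"source": "web", "units": "3"},
]

def storage_layer():
    """Layer 1: where the data resides."""
    return list(raw_store)

def integration_layer(records):
    """Layer 2: prep, clean and integrate records from different sources."""
    return [{"source": r["source"], "units": int(r["units"])}
            for r in records if r["units"] is not None]

def analytics_layer(records):
    """Layer 3: draw conclusions, here total units by source."""
    totals = {}
    for r in records:
        totals[r["source"]] = totals.get(r["source"], 0) + r["units"]
    return totals

def predictive_layer(totals, growth=1.1):
    """Layer 4: look forward, here with a naive assumed growth factor."""
    return {source: round(value * growth, 1) for source, value in totals.items()}

clean = integration_layer(storage_layer())
print(predictive_layer(analytics_layer(clean)))
```

In a real deployment, each function would be a separate system, which is exactly the layering the stack concept describes.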

Richard Daley, one of the founders and chief strategy officer of analytics and business intelligence specialist Pentaho, explains why this approach will help to increase the value of the collection process. 

"In the last 12 months, we've seen more and more people doing big data for gain," he says. "There is much more to gain from analyzing and utilizing this big data than just storing it."

This represents another in a long line of exciting improvements in database technology. 

Big data enhances sociological research

Humans are a social species, and more and more of those interactions are being captured as data.

The average person produces just under a terabyte of information per year, and that number is only going up. For perspective's sake, imagine writing out each binary decision, represented by a one or a zero, by hand. If you scribbled out the amount we all produce yearly, the string wouldn't just stretch to Saturn; it would stretch to Saturn and back 25 times. Represented as sheep, the collective amount of data produced by people annually would fill the universe snugly, without any gaps.

It puts the "big" in Big Data.

All of this communication has an ancillary effect: sociologists, once constrained to wonder what people were saying and extrapolate from limited data, now have access to a greater wealth of firsthand knowledge than they could ever have imagined. And that wealth is growing quickly. It has led many researchers to seriously consider how much lasting insight something as fundamentally nonacademic as social media could yield. 

Take, for example, Jon Levin. The Stanford economist investigated the ways vendors set prices on the popular auction site eBay. Using custom web application development and access to hundreds of thousands of pricing decisions, he was able to parse out several important trends, which confirmed some theories of pricing but exposed some significant errors. By grounding a largely theoretical field in actual human interactions, he was able to improve both substantially. 

For his efforts, Levin was awarded the John Bates Clark Medal, the highest honor given to an economist under 40. 

It's not an isolated achievement, either. A research team at Harvard was able to combine IRS data with school district information to map out the long-term effects of being matched with a good teacher during formative years. Not only did they find that it had an effect on college matriculation rates, it also had an impact on income and the neighborhood a student would eventually end up living in. 

This year, Raj Chetty, who led the study, also won the John Bates Clark Medal.

The takeaway here is clear: when focused through a sharp academic lens, what might seem like a sprawl of Big Data can yield some valuable insights. 

Custom database software spurs America’s Cup win

Choppy waters don't exactly have much in common with an office environment. One thing they do share, however, is that the principles behind Big Data can apply equally well to both. In fact, it was those ideas that propelled Larry Ellison to an incredible win in the America's Cup.

In a sense, it's not surprising that Ellison should have sought to marry analytics and sailing. His company, Oracle, is one of the biggest technology companies in the entire world. It was this savvy that he was able to apply to the design and tactics of his chosen boat for one of the most prestigious races in the world.

Oracle's victory was also a win for analytics and custom database software. Both of its boats carried hundreds of sensors recording thousands of variables, some of them sampled as often as 600 times per minute. The sailors themselves were given electronic tablets, wrist displays and wireless connectivity. All of this information allowed designers to better understand how the boat would handle different conditions, as well as how close it was to operating at peak efficiency at all times.
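To make the scale of that telemetry concrete, here is a hypothetical Python sketch that reduces one simulated minute of readings, sampled 600 times per minute, into per-second averages a designer could actually review. The wind-speed variable and the readings themselves are invented for the example; Oracle's actual instrumentation is not public.

```python
# Hypothetical sketch: summarizing one minute of high-frequency sensor data.
from statistics import mean

SAMPLES_PER_MINUTE = 600
SAMPLES_PER_SECOND = SAMPLES_PER_MINUTE // 60  # 10 readings per second

# Simulated wind-speed readings in knots; values are invented.
readings = [18.0 + 0.05 * ((i * 7) % 11) for i in range(SAMPLES_PER_MINUTE)]

# Collapse the raw stream into 60 per-second averages.
per_second = [
    round(mean(readings[s * SAMPLES_PER_SECOND:(s + 1) * SAMPLES_PER_SECOND]), 2)
    for s in range(60)
]

print("peak second average:", max(per_second), "knots")
print("minute average:", round(mean(readings), 2), "knots")
```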

All of this data-gathering yielded some valuable insights. Instead of the longer 80-foot trimarans, the team deduced that 72 feet was the largest boat they could easily control. By deciding on a catamaran, they were able to achieve high speeds without sacrificing the necessary portability — boats must be capable of being transported in shipping containers and reassembled within two days. 

There's a valuable takeaway here, even for those not inclined towards the seas: the more information you can process and analyze, the more of an advantage you'll have. 

Could FileMaker be the solution to truancy?

Schools around the country have seen FileMaker support their education efforts, providing meaningful data on student achievement and helping teachers contextualize performance and provide help when necessary. But in order for the software to help students, they have to show up to school in the first place.

Can a database drag a student out of bed and into the classroom? Not quite, but it might be able to do the next best thing.

That's the ambition behind an initiative by the Arlington County school district. Administrators have tapped ten teams of data scientists to analyze anonymized information and extract any meaningful trends that could help address the district's dropout rate. A panel of educators and analysts will judge the most compelling submission and award a $10,000 cash prize. 

Arlington's truancy rates aren't terrible: they've been slashed from 13 percent in 2008 to just 6 percent this year. But the county is anxious to do more, understanding that every student should be put in a position to succeed. Crowdsourcing insights from outside data teams is a step in that direction. 

"Increasingly, people are considering this [data] a public resource. At the end of the day, it was created with public dollars," said Chris Kingsley, a policy analyst at the Data Quality Campaign. "If we can publish it and let other people come in and look at it, we can derive more value out of this data."

The teams will get access to information on assessment scores, schools attended, courses taken, grades, absences, demographics and graduation status. If they're able to extract salient takeaways, it could push Arlington County's dropout rate even lower, a powerful step toward its goal of providing a quality education for every student.
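As a purely illustrative sketch of the kind of analysis those teams might attempt, here is a small Python example that flags students for follow-up using two of the fields listed above, absences and grades. The records, weights and threshold are invented and have no connection to Arlington County's actual data or methodology.

```python
# Hypothetical sketch: flagging at-risk students from absences and GPA.

students = [
    {"id": "S1", "absences": 24, "gpa": 1.9},
    {"id": "S2", "absences": 3,  "gpa": 3.4},
    {"id": "S3", "absences": 15, "gpa": 2.4},
]

def risk_score(student: dict) -> float:
    """Simple composite: more absences and lower grades raise the score."""
    return student["absences"] * 0.05 + (4.0 - student["gpa"]) * 0.5

# Invented threshold; a real model would calibrate this against outcomes.
flagged = [s["id"] for s in students if risk_score(s) >= 1.5]
print("students to follow up with:", flagged)
```

The competing teams would presumably build far richer models, but even a crude score like this shows how attendance and grade data can be turned into an early-warning signal.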