Time Series Prediction Approaches

Time Series Journal

Top Stories

True Democratization of Analytics with Meta-Learning By Mark Troester There are many solutions that claim to democratize analytics, but in reality they are constrained. A meta-learning approach democratizes without limits. The democratization of analytics has become a popular term, and a quick Google search will generate results that explore the necessity of empowering more people with analytics and the rise of citizen data scientists. The ability to easily make better use of your (constantly growing) pool of data is a critical driver of business success, but many of the existing solutions that claim to democratize analytics only do so within severe limits. If you have a complex business scenario and are hoping to get revolutionary insights from them, it’s easy to come away disappointed. However, the democratization of analytics isn’t just a buzzword that refers to a n... (more)

Looking for answers in a bunch of charts and metrics? No thanks.

Let’s take a look at what many monitoring tools have in common. When it comes to “full stack” performance monitoring, you will find a collection of vendors including New Relic, Datadog, SignalFx, Wavefront, and Sysdig offering solutions that are remarkably similar in many ways. With this article, I’d like to explore something that all these tools have in common – lots of metrics and lots of charts. Lots of easy-to-get metrics. The first thing to notice is that these tools can capture metrics from a wide variety of data sources via “integrations” or “plug-ins”, as shown in the accompanying screen shots. Typical examples include host performance, JMX, network throughput, WMI, Docker health, SNMP, and AWS metrics. These metrics are relatively easy to capture (technically speaking, pulling JMX data is not hard to do), although configuring these integrations and plug-ins isn’t al... (more)
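
To make the “easy to get” point concrete, here is a minimal Java sketch using only the standard java.lang.management API to read the kind of heap metrics these tools chart. It assumes the local JVM; a real agent would reach the same MBeans over a remote JMX connection:

    import java.lang.management.ManagementFactory;
    import java.lang.management.MemoryMXBean;
    import java.lang.management.MemoryUsage;

    public class JmxHeapProbe {
        public static void main(String[] args) {
            // The platform MBean server exposes the JVM's built-in metrics.
            MemoryMXBean memory = ManagementFactory.getMemoryMXBean();
            MemoryUsage heap = memory.getHeapMemoryUsage();

            // These raw numbers are what a monitoring integration would chart.
            System.out.printf("heap used:      %d bytes%n", heap.getUsed());
            System.out.printf("heap committed: %d bytes%n", heap.getCommitted());
            System.out.printf("heap max:       %d bytes%n", heap.getMax());
        }
    }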

It's Official: Welcome to the 'Technology Bounce Back'

All the myriad commentators who monitor Internet technologies and the i-Technology companies on the NASDAQ doubtless have their own private cluster of indicators that they use to take a weather-check on the overall state of the industry. For some, it's as simple as looking at the NASDAQ index level. This (wholly understandable) approach is the one adopted by SYS-CON's own Roger Strukhoff, who wrote recently: We all know that after going over 5000 at the height of the dot-com bubble, the index plunged precipitously and consistently for the next 18 months. Any hope of a quick recovery was dashed by 9/11, and then a new flicker of hope was extinguished when war came in March 2003. Since then, the NASDAQ's most important numbers have been 2000, 2000, and 2000. The first of the three numbers represents the year of its peak, the second the level at which it settled, and the thi... (more)

Data Landscape at Facebook By @JnanDash | @CloudExpo [#BigData]

What does the data landscape look like at Facebook with its 1.3 billion users across the globe? They classify as small data the OLTP-like queries that process and retrieve a small amount of data, usually 1-1000 objects requested by their id. Indexes limit the amount of data accessed during a single query, regardless of the total volume. Big data refers to queries that process large amounts of data, usually for analysis: troubleshooting, identifying trends, and making decisions. The total volume of data is massive for both small and big data, ranging from hundreds of terabytes to hundreds of petabytes on disk. The heart of Facebook’s core data is TAO (The Associations and Objects), a distributed data store for the social graph. The workload on this is extremely demanding. Every time any one of over a billion active users visits Facebo... (more)
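
The excerpt does not show TAO’s actual API, but the id-keyed “small data” access pattern it describes can be sketched as a toy in-memory store in Java. The class and method names below are illustrative, not Facebook’s:

    import java.util.List;
    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;
    import java.util.concurrent.ConcurrentMap;
    import java.util.concurrent.CopyOnWriteArrayList;

    // Toy stand-in for an objects-and-associations store; everything is
    // keyed by id, so a single query touches a bounded amount of data.
    public class SocialGraphStore {
        private final ConcurrentMap<Long, Map<String, String>> objects = new ConcurrentHashMap<>();
        private final ConcurrentMap<Long, List<Long>> friends = new ConcurrentHashMap<>();

        // OLTP-style point read: fetch one object by its id.
        public Map<String, String> getObject(long id) {
            return objects.get(id);
        }

        // Association read: the ids linked to one object.
        public List<Long> getFriendIds(long id) {
            return friends.getOrDefault(id, List.of());
        }

        public void putObject(long id, Map<String, String> fields) {
            objects.put(id, fields);
        }

        public void addFriend(long id, long friendId) {
            friends.computeIfAbsent(id, k -> new CopyOnWriteArrayList<>()).add(friendId);
        }

        public static void main(String[] args) {
            SocialGraphStore store = new SocialGraphStore();
            store.putObject(1L, Map.of("type", "user", "name", "alice"));
            store.putObject(2L, Map.of("type", "user", "name", "bob"));
            store.addFriend(1L, 2L);
            System.out.println(store.getObject(1L));    // point read by id
            System.out.println(store.getFriendIds(1L)); // association read by id
        }
    }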

Java Games Development - Part 2

Part 1 of this series appeared in the August issue of Java Developer's Journal (Vol. 8, issue 8). JDJ: I'd just like to pick up on that 85% portability goal Jeff mentioned earlier. I'm just going on assumptions, but I think if you were developing a title for the PS2, GameCube, and Xbox you would attempt to make sure that only the graphics and audio functionality were platform-specific and make the rest of the game as portable as possible. Seventy-five to eighty-five percent portability would therefore seem to be an achievable goal in C/C++, in which case Java has just lost one of its advantages, has it not? Cas P: Usually I'm even more optimistic than Jeff about something here. I think I can achieve 100% portability. By focusing on a "pure Java platform" like the LWJGL (Lightweight Java Gaming Library), which, once you realize you're coding to the LWJGL Java API, not... (more)
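
For readers who have not used it, the LWJGL game skeleton of that era (the 1.x/2.x-style Display API) is only a few lines of pure Java. This is a minimal sketch of the pattern, not a complete game loop:

    import org.lwjgl.LWJGLException;
    import org.lwjgl.opengl.Display;
    import org.lwjgl.opengl.DisplayMode;

    public class GameSkeleton {
        public static void main(String[] args) throws LWJGLException {
            Display.setDisplayMode(new DisplayMode(640, 480));
            Display.setTitle("LWJGL skeleton");
            Display.create();

            while (!Display.isCloseRequested()) {
                // Game logic and OpenGL rendering would go here; the same
                // code runs wherever LWJGL's natives are available.
                Display.update(); // swap buffers and poll input
            }

            Display.destroy();
        }
    }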

A Nightclub in Your Pocket

4G will revolutionize wireless entertainment by allowing users to access content at broadband speeds. The killer apps for entertainment include gaming, books/magazines, gambling, video, and adult content. 4G wireless (wireless ad hoc peer-to-peer networking) eliminates the spoke-and-hub weakness of cellular architectures because the elimination of a single node does not disable the network. Simply put, if you can do it in your home or office while wired to the Internet, you can do it wirelessly in a 4G network. My son was playing Pokémon Red version on his Game Boy the other day. Bored with that, apparently, and bored with the other color versions of the Pokémon spectrum, and with no other kids within one meter to connect his Game Boy to via a cable, his journey with Pikachu ended for that day. But what if my son could battle against Ash, Misty, and Brock without a ca... (more)
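
The topology claim is easy to check with a toy experiment: remove the hub from a spoke-and-hub network and it partitions, while a peer-to-peer mesh survives the loss of any single node. A hypothetical Java sketch, with the node numbering and links invented for illustration:

    import java.util.ArrayDeque;
    import java.util.HashMap;
    import java.util.HashSet;
    import java.util.List;
    import java.util.Map;
    import java.util.Queue;
    import java.util.Set;

    public class TopologyDemo {
        public static void main(String[] args) {
            // Star: node 0 is the hub (the cell tower); 1..4 are handsets.
            Map<Integer, Set<Integer>> star = new HashMap<>();
            for (int i = 1; i <= 4; i++) link(star, 0, i);

            // Mesh: each of the five peers links directly to two neighbors.
            Map<Integer, Set<Integer>> mesh = new HashMap<>();
            for (int i = 0; i <= 4; i++) link(mesh, i, (i + 1) % 5);
            for (int i = 0; i <= 4; i++) link(mesh, i, (i + 2) % 5);

            System.out.println("star without hub still connected? " + connectedWithout(star, 0)); // false
            System.out.println("mesh without one node connected?  " + connectedWithout(mesh, 0)); // true
        }

        static void link(Map<Integer, Set<Integer>> g, int a, int b) {
            g.computeIfAbsent(a, k -> new HashSet<>()).add(b);
            g.computeIfAbsent(b, k -> new HashSet<>()).add(a);
        }

        // Breadth-first search over the graph with one node deleted.
        static boolean connectedWithout(Map<Integer, Set<Integer>> g, int removed) {
            Set<Integer> remaining = new HashSet<>(g.keySet());
            remaining.remove(removed);
            Integer start = remaining.iterator().next();
            Set<Integer> seen = new HashSet<>(List.of(start));
            Queue<Integer> queue = new ArrayDeque<>(List.of(start));
            while (!queue.isEmpty()) {
                for (int next : g.get(queue.poll())) {
                    if (next != removed && seen.add(next)) queue.add(next);
                }
            }
            return seen.equals(remaining);
        }
    }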

Monster Hunting Yacht Charter

The Highlands of Scotland might not be the most obvious place to take a yacht, but a combination of sea-canals and the largest body of water in the UK make it a surprisingly accessible destination for all but the largest yachts, with a history which still echoes today and some of the most spectacular landscapes in the world. Loch Ness contains more water than all the rivers and lakes in the UK put together: it’s over 700 feet deep and 23 miles long, and the local peat makes the water extremely murky and ideal for hiding prehistoric monsters. The size of the Loch can make conditions remarkably sea-like, with waves generally around 3 feet but often larger. The top of the Loch is in the north-east of Scotland, just south of Inverness, and along its length it heads south-west diagonally, following a line known as the Great Glen, which bisects Scotland in a series o... (more)

Seaspan Corporation Signs Contract to Build Two New 2500 TEU Vessels

HONG KONG, CHINA -- (MARKET WIRE) -- 03/30/07 -- Seaspan Corporation ("Seaspan") (NYSE: SSW) today announced that it has signed contracts to build two 2500 TEU vessels at Jiangsu Yangzijiang Shipbuilding Co. Ltd. ("Jiangsu") in China. These new orders are in addition to the eight 2500 TEU vessels Seaspan ordered from Jiangsu in 2006. The two newbuilding vessels will be delivered in March and June 2010. The total delivered cost is expected to be approximately $46.4 million per vessel, subject to certain pre-delivery expenses remaining at budgeted levels. Seaspan also announced that it has arranged simultaneous ten-year charter agreements for these two vessels with Kawasaki Kisen Kaisha, Ltd. ("K-Line") of Japan at a rate of $17,880 per day. K-Line is Japan's third-largest liner operator and is ranked 13th in the world by TEU capacity. Each new vessel is expected to ... (more)

Multi-Core and Massively Parallel Processors

As software developers we have enjoyed a long trend of consistent performance improvement from processor technology. In fact, for the last 20 years processor performance has doubled roughly every two years. What would happen in a world where these performance improvements suddenly slowed dramatically or even stopped? Could we continue to build bigger, heavier, feature-rich software? Would it be time to pack up our compilers and go home? The truth is, single-threaded performance improvement is likely to see a significant slowdown over the next one to three years. In some cases, single-thread performance may even drop. The long and sustained climb will slow dramatically. We call the cause behind this trend the CLIP level:
C - Clock frequency increases have hit a thermal wall
L - Latency of processor-to-memory requests continues as a key performance... (more)
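
If single-thread performance stalls, further gains have to come from spreading work across cores. A minimal Java illustration of that shift, using the standard parallel streams machinery rather than anything from the article itself:

    import java.util.stream.LongStream;

    public class ParallelSum {
        public static void main(String[] args) {
            long n = 100_000_000L;

            // The same reduction, first on one thread...
            long t0 = System.nanoTime();
            long serial = LongStream.rangeClosed(1, n).sum();
            long t1 = System.nanoTime();

            // ...then fanned out across all cores via the fork/join pool.
            long parallel = LongStream.rangeClosed(1, n).parallel().sum();
            long t2 = System.nanoTime();

            System.out.printf("serial:   %d in %d ms%n", serial, (t1 - t0) / 1_000_000);
            System.out.printf("parallel: %d in %d ms%n", parallel, (t2 - t1) / 1_000_000);
        }
    }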

Evolution of Web 3.0

Web 3.0 is a different way of building applications and interacting on the web. The core model of Web 3.0 states that the entire World Wide Web will be seen as a single database. Many tools are being developed to enhance interactivity between different websites holding different data. The prediction is that Web 3.0 will ultimately be seen as web applications that are pieced together. These applications share a number of characteristics: they are relatively small, the data is in the cloud, they can run on any device (PC or mobile phone), and they are fast and customizable. Furthermore, the applications are distributed virally: literally by social networks or by email. That's a very different application model than we've ever seen in computing. However, there is still considerable debate as to what the term Web 3.0 means, and what a suitable definition might... (more)
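
A minimal sketch of that application model in Java: a small client that treats a remote service as its database and simply renders what it fetches. The endpoint URL here is hypothetical:

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class CloudDataClient {
        public static void main(String[] args) throws Exception {
            HttpClient client = HttpClient.newHttpClient();

            // Hypothetical endpoint; the app's data lives in the cloud.
            HttpRequest request = HttpRequest.newBuilder(
                    URI.create("https://api.example.com/items/42"))
                    .header("Accept", "application/json")
                    .build();

            // The client only queries and displays, which is why the same
            // small app can run on a PC or a mobile phone.
            HttpResponse<String> response =
                    client.send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.statusCode());
            System.out.println(response.body());
        }
    }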

Palisade’s DecisionTools Suite 5.7 Migrates to 64-Bit Architecture

New Software Accommodates Big Datasets, Complex Calculations. LONDON, UK -- 2nd November 2010 -- Palisade Corporation has announced the 5.7 release of its best-selling DecisionTools Suite, now available with support for 64-bit Excel 2010. The suite is an integrated set of seven quantitative analysis tools that run in Microsoft Excel and are widely used for decision support in business, finance, engineering and science. In recent years, computational tasks such as Monte Carlo simulation and neural networks have moved into the enterprise mainstream, and the data available for analysis have increased exponentially. As a result, decision-makers across industry sectors have often hit computational ceilings as their models become too big for their existing hardware to handle. 64-bit architecture allows computers to access more memory at once. While a traditional 32-bit system "m... (more)
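
The memory argument is simple arithmetic: a 32-bit pointer can address 2^32 bytes, or 4 GiB, in total, while 64-bit addressing raises the theoretical ceiling to 2^64 bytes. A quick Java check of those numbers:

    public class AddressSpace {
        public static void main(String[] args) {
            long bytes32 = 1L << 32;          // 4,294,967,296 bytes
            double bytes64 = Math.pow(2, 64); // ~1.8e19 bytes

            System.out.printf("32-bit address space: %d bytes (%.0f GiB)%n",
                    bytes32, bytes32 / Math.pow(2, 30));
            System.out.printf("64-bit address space: %.3e bytes%n", bytes64);
        }
    }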