
Sample Technical Requirements for Hosting Jet Analytics

Background

The following article is meant to be a guide for hosting Jet Analytics, either on-premises or within a public cloud service such as Azure. One question that comes up frequently is how to determine how much physical RAM is required for the database server running SQL Server and the Jet Analytics BI solution.

There is, however, a sizable difference between the "minimum requirements" and the "suggested requirements".  The difference depends on the size and number of the databases in use, the number of users, and the existing infrastructure in place.

The minimum requirements are easy to meet: at least 8 GB of RAM and enough hard drive space to hold the data warehouse, staging database, and cubes.  However, meeting the bare minimum is often not ideal, and providing better hardware for your server will allow for improved run-times and efficiency.

The ideal setup will obviously vary by client, but the following are some general ideas.  The main thing to keep in mind when reading the following and designing your server is scalability.  If better performance is desired in the future, you will need to be able to add more memory or processors to facilitate this while avoiding a complete server rebuild.

Considerations

  • Number of Users: If fifteen or fewer users will be accessing and pulling data from the cubes, you do not need to take extra memory per user into consideration.  When more than fifteen concurrent users are likely to be using the cubes, you should consider adding 1-2 GB of RAM for every five additional users.

  • Execution Packages: How often does your data need to be updated?  If updates are required every thirty minutes, you need the hardware to accomplish this.  However, if the updates are completed after hours, the hardware requirements are lower.  SQL Server ETL transformations load the data being processed into your server's RAM before processing against it.  This means that the more RAM your server has, the quicker your execution packages will complete.

  • Size of Database(s): This is the most important consideration because it directly impacts how much processing must be done to populate the data warehouse.  In general, a server with 16 GB of RAM is adequate for a database of 50 GB or less, provided you are only updating nightly and anticipate no more than ten concurrent users.  If your database is larger than 50 GB, or you anticipate more than ten concurrent users, you should increase your RAM and disk allocation for SQL Server accordingly, with scalability in mind.

  • Rate of Growth: At what rate is the database growing per year?  Make sure to factor anticipated business growth into this rate as well.  Building a server to house only your data in its current state is short-sighted and could become problematic.

  • OS Requirements: A good rule of thumb is to reserve 1 GB of RAM for the OS by default, plus an additional 1 GB for every 4 GB installed between 4 and 16 GB, and another 1 GB for every 8 GB installed above 16 GB.  In a server with 32 GB of RAM, this works out to 6 GB reserved for the OS, with the remaining 26 GB dedicated to your SQL Server.

  • Dedicated or Shared: Will other software be operating on your server?  When figuring out your RAM requirements and server hardware needs, make sure to take this factor into consideration.  If additional software will run on the same server, you will want to allocate enough RAM and hard-drive space to each application.
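Taken together, the user-count and OS-reservation rules of thumb above can be expressed as a small calculator.  The following is a minimal Python sketch of that arithmetic; the function names, and the choice of 2 GB per five extra users (the rule above allows 1-2 GB), are illustrative assumptions rather than an official sizing tool.

```python
def os_reserved_gb(total_ram_gb: int) -> int:
    """Rule of thumb: reserve 1 GB for the OS, plus 1 GB per 4 GB
    installed between 4 and 16 GB, plus 1 GB per 8 GB above 16 GB."""
    reserved = 1
    if total_ram_gb > 4:
        reserved += (min(total_ram_gb, 16) - 4) // 4
    if total_ram_gb > 16:
        reserved += (total_ram_gb - 16) // 8
    return reserved


def extra_user_ram_gb(concurrent_users: int, gb_per_five_users: int = 2) -> int:
    """Extra RAM for cube users beyond the first fifteen,
    counted in whole groups of five."""
    if concurrent_users <= 15:
        return 0
    extra_users = concurrent_users - 15
    groups_of_five = (extra_users + 4) // 5  # round up
    return groups_of_five * gb_per_five_users


print(os_reserved_gb(32))     # 32 GB server -> 6 GB reserved for the OS
print(extra_user_ram_gb(25))  # 25 users -> 2 extra groups of five -> 4 GB
```

Running this for the 32 GB example above reserves 6 GB for the OS, leaving 26 GB for SQL Server.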

Recommendations

For an 'average' client who has a database in the 100 GB range, we recommend something like the following:

  • High-speed drives for the data (a SAN is ideal), with separate storage for SQL log, data, and temp files.

  • 4-8 cores with one or more high-performing processors.

  • 16-64 GB of RAM.

RAM

The main thing to remember is that you want your BI (Business Intelligence) solution to be scalable and extensible.  If better performance is desired, the ability to simply add more RAM becomes crucial.  Many of our clients have found that doubling their RAM from 16 GB to 32 GB can cut the run-time of an execution package from 4 hours down to 2 hours.  As mentioned previously, SQL Server loads the data being processed into your server's RAM before processing against it.  This optimizes performance so that all calculations happen in memory rather than reading from disk at each new row.  If your server has enough RAM, and it is within the limits of your SQL Server edition, SQL Server will eventually read your entire database into RAM.  To figure out this 'best-case' amount of RAM, add up the size of all your active databases (data only, not logs); that total is the amount of RAM you could potentially use, subject to your SQL Server edition's limits.  This is obviously a 'best-case' scenario and is not realistic or necessary for all companies.

Additionally, it is important to consider which version of SQL Server you will be using when determining your server's hardware.  The following table details the maximum RAM supported by the various supported versions of SQL Server.  These numbers apply only to 64-bit installations.

Version          Standard  Developer (BI)  Enterprise
SQL Server 2016  128 GB    OS max          OS max
SQL Server 2019  128 GB    OS max          OS max
SQL Server 2022  128 GB    OS max          OS max
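Combining the edition cap with the OS reservation, the RAM SQL Server can actually use is the smaller of the two.  A minimal Python sketch follows; the dictionary and function names are illustrative, and "OS max" is modeled as no cap:

```python
# Illustrative edition caps in GB, taken from the table above.
# None means the edition is limited only by the operating system.
EDITION_RAM_CAP_GB = {"Standard": 128, "Developer": None, "Enterprise": None}


def usable_sql_ram_gb(installed_gb: int, edition: str, os_reserved_gb: int) -> int:
    """RAM SQL Server can use: installed RAM minus the OS reservation,
    further capped by the edition limit where one exists."""
    cap = EDITION_RAM_CAP_GB[edition]
    usable = installed_gb - os_reserved_gb
    return usable if cap is None else min(usable, cap)


print(usable_sql_ram_gb(256, "Standard", 12))    # capped at 128 by Standard edition
print(usable_sql_ram_gb(256, "Enterprise", 12))  # 244, limited only by the OS
```

This shows why edition matters at larger sizes: on a 256 GB server, Standard edition leaves roughly half of the installed RAM unused by SQL Server.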
HARD DRIVES

When sizing the server, you ideally want something that will last you for a while.  If your current database is at 100 GB with an annual growth rate of 15 GB, you will want to build your server with a hard drive larger than 150 GB, or you will outgrow it in roughly 3 years, if not sooner.  If a SAN array is an option (or one is already in place), that is the best choice for hosting your database files.  Otherwise, the recommended hard drive setup includes 3 separate high-speed disk drives (i.e., 15k RPM).  In either solution, our recommendation is to have dedicated drives for the following:

Drive 1: Data drive (size of source databases x2)

Drive 2: TempDB database (variable, but generally 100 GB or less)

Drive 3: Standard C: drive to hold OS, Program Files, etc. (variable, but generally 100 GB or less)

The location of these databases is set during installation of SQL Server.  To move databases after installation, please refer to this KB article.

PROCESSORS

This is less critical than the items above, but it is still worth noting when sizing your server's hardware.  The BI solution will run on a processor of any speed and core count; however, 4-8 cores are recommended for smaller companies, while larger companies should aim for 8-16 cores.
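The drive-sizing and growth guidance above lends itself to a quick back-of-the-envelope check.  The following Python sketch assumes simple linear growth, matching the 100 GB database and 15 GB-per-year example; the function names are illustrative:

```python
def recommended_data_drive_gb(source_db_gb: float) -> float:
    """Rule of thumb from above: size the data drive at
    twice the size of the source databases."""
    return source_db_gb * 2


def years_until_outgrown(drive_gb: float, current_db_gb: float,
                         annual_growth_gb: float) -> float:
    """Years before a linearly growing database fills the drive."""
    if current_db_gb >= drive_gb:
        return 0.0
    return (drive_gb - current_db_gb) / annual_growth_gb


# The example from the text: a 100 GB database growing 15 GB per year
# outgrows a 150 GB drive in roughly 3.3 years.
print(round(years_until_outgrown(150, 100, 15), 1))  # 3.3
print(recommended_data_drive_gb(100))                # 200.0
```

A calculation like this makes the "Rate of Growth" consideration concrete: pick a drive size, then check how many years of headroom it actually buys.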

Summary

The key thing to remember is scalability.  If better performance is required down the road, being able to expand on what you already have is important.  Additionally, when possible, the general recommendation is to start smaller and add resources as better performance is required.  The "ideal" server will vary from company to company, but hopefully the above gives you a good basis for your decisions.
