
Querying a Huge Volume of Data

Hi,

We have developed an in-house productivity monitoring tool for our organization. The objective: a client tool installed on every user's desktop/laptop tracks all user activity and sends the details to a SQL Server database. A user interface lets us view details such as "who spends the most time on social websites" or "which team is most productive". Details can be viewed for the entire organization or for a particular team, and can be drilled down to a single user.

The problem: around 200,000 records are inserted per day (100 users * 2,000 rows/day) into a single table. Pulling a report through our tool takes a long time (minutes in some cases). At the same time, we tested a third-party productivity monitoring tool that produces the same report in a fraction of a second, irrespective of the volume of data being fetched. Both the in-house tool and the third-party application are hosted on the same server; the only difference is that our application uses SQL Server 2016 while the third-party tool uses MySQL.

Are we missing anything? How can we improve the performance of our application to match the third-party tool?

Note: the search conditions for a report may be any or all of the following: user, team, organization, date range, application type (social websites, e-commerce websites, business applications, etc.). We have indexed all the search criteria.
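For context, one pattern we have read about for this kind of reporting workload is pre-aggregation on top of a covering index, rather than indexing the raw table alone. A minimal sketch follows; all table and column names (`dbo.UserActivity`, `DurationSeconds`, etc.) are hypothetical stand-ins, since our actual schema is not shown here:

```sql
-- Hypothetical covering index matching the report filters;
-- INCLUDE puts the measure in the leaf pages to avoid key lookups.
CREATE NONCLUSTERED INDEX IX_UserActivity_Report
    ON dbo.UserActivity (ActivityDate, TeamId, UserId, AppType)
    INCLUDE (DurationSeconds);

-- Pre-aggregated daily summary: report queries stay small
-- no matter how many raw rows accumulate over time.
CREATE TABLE dbo.DailyActivitySummary (
    ActivityDate date    NOT NULL,
    TeamId       int     NOT NULL,
    UserId       int     NOT NULL,
    AppType      tinyint NOT NULL,
    TotalSeconds bigint  NOT NULL,
    CONSTRAINT PK_DailyActivitySummary
        PRIMARY KEY (ActivityDate, TeamId, UserId, AppType)
);

-- Nightly rollup of yesterday's raw rows into the summary table.
INSERT INTO dbo.DailyActivitySummary
       (ActivityDate, TeamId, UserId, AppType, TotalSeconds)
SELECT ActivityDate, TeamId, UserId, AppType, SUM(DurationSeconds)
FROM   dbo.UserActivity
WHERE  ActivityDate = CAST(DATEADD(day, -1, GETDATE()) AS date)
GROUP BY ActivityDate, TeamId, UserId, AppType;

-- A report then scans hundreds of summary rows instead of millions
-- of raw rows, e.g. time per team for a date range and app type:
SELECT TeamId, SUM(TotalSeconds) AS TotalSeconds
FROM   dbo.DailyActivitySummary
WHERE  ActivityDate BETWEEN @FromDate AND @ToDate
  AND  AppType = @AppType
GROUP BY TeamId;
```

Is this the kind of approach the third-party tool is likely using, or should we be looking elsewhere (execution plans, partitioning, columnstore)?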
