On our production server, at a specific time of day, the thread count climbs to the point that, although CPU utilization is normal (30-50%), queries start to run slowly and we see many more blocking statements.
I am not sure where to look. When our site runs normally, the thread count is around 150, but during a specific window each day (1:30 to 2:30) it climbs to 270. There are no extra SQL transactions going on; everything is as normal as before, but the thread count grows and SQL Server starts behaving slowly.
After restarting the SQL Server service, the thread count immediately returns to normal and our site functions fine for another 24 hours.
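One way to see where the extra threads are going while the spike is happening, rather than restarting the service, is to snapshot worker usage and blocking from the DMVs; a sketch, assuming a connection with VIEW SERVER STATE permission:

```sql
-- Worker and task counts per scheduler (scheduler_id >= 255 are internal):
select scheduler_id, current_tasks_count, runnable_tasks_count,
       current_workers_count, active_workers_count, work_queue_count
from sys.dm_os_schedulers
where scheduler_id < 255;

-- Requests currently blocked, and what they are waiting on:
select session_id, blocking_session_id, wait_type, wait_time, command
from sys.dm_exec_requests
where blocking_session_id <> 0;
```

Capturing this during the 1:30-2:30 window should show whether the extra workers are piled up behind one blocking session or spread across parallel queries.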
3 Answers Found
-- The first thing to check when CPU is high is to look for parallel queries.
-- Tasks running in parallel (filtering out MARS requests below):
select *
from sys.dm_os_tasks as t
where t.session_id in
(
    select t1.session_id
    from sys.dm_os_tasks as t1
    group by t1.session_id
    having count(*) > 1
    and min(t1.request_id) = max(t1.request_id)
);
-- Requests running in parallel:
select *
from sys.dm_exec_requests as r
inner join
(
    select t1.session_id, min(t1.request_id)
    from sys.dm_os_tasks as t1
    group by t1.session_id
    having count(*) > 1
    and min(t1.request_id) = max(t1.request_id)
) as t(session_id, request_id)
on r.session_id = t.session_id
and r.request_id = t.request_id;
Select * from employees where divisionID = 1 takes 3 seconds to complete.
The table holds 2,835 rows, all with divisionID = 1.
The server is 2.4 GHz x 16 with 16 GB RAM.
I have an issue with SQL Server 2005. I have a query that, when I run it in Management Studio, completes in 8 minutes, which is an acceptable time for me; but when I put the same query in a job executed by SQL Agent, it takes more than an hour to complete.
Does anyone have any hints or recommendations on what the cause of this issue could be?
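One common cause is that Agent sessions run with different SET options than SSMS (ARITHABORT in particular), which can produce a very different cached plan for the same query. A sketch for comparing the settings of the two kinds of sessions, assuming both are connected at the same time:

```sql
-- SSMS defaults ARITHABORT to ON; SQL Agent and many client libraries
-- leave it OFF, which can select a different (and much worse) plan.
select session_id, [program_name], arithabort, ansi_nulls, quoted_identifier
from sys.dm_exec_sessions
where [program_name] like N'SQLAgent%'
   or [program_name] like N'Microsoft SQL Server Management Studio%';
```

If the options differ, adding explicit SET statements at the top of the job step is one way to make the two environments comparable.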
How can I investigate the possible reasons for slow-running SQL Server Agent jobs on a server? I have checked that there is no CPU or memory pressure as such on the server.
Looking for some pointers for further investigation.
I have written a C# application that runs a bunch of SQL procedures to provide results for a game. The server is just one giant loop. To utilize more CPU cores, I decided to divide the loop across 10 threads, so a server that would have run 10,000 iterations now runs only 1,000 per thread. This works great and, if you do the math, it is roughly 10 times faster on my i7.
The problem is that if you grow that 10,000-iteration loop (which takes 16 seconds) to a 100,000-iteration loop, 10x larger, you would expect 160 seconds, but it actually takes about 5 minutes. If you watch Task Manager while the program is running, the CPU utilization for each core starts out around 90%, then after 10 seconds drops to about 50% per core and stays there until the task is done.
Is this SQL Server or Windows throttling? Is there a way, besides changing thread priority, to get more utilization per thread?
Any help would be appreciated.
I have created an ASP.NET website using Visual Studio 2008. It is deployed to a local server in my office running Windows Server 2003. This intranet website is accessed by about 200 employees at a time from 3 different corners of the world.
It runs fine in a local browser, but it hogs about 25% of the CPU when run in a Citrix environment. It runs in IE 8 with IE 7 compatibility mode.
Many users open two browsers at a time to run different pages of the website.
The pages use AJAX Control Toolkit controls and a lot of CSS.
Does anyone have any ideas or experience with this?
I have two UPDATE queries that, when executed together from the command line, take about 5 minutes, but when executed together in Management Studio take only about 1 minute in total.
I have indexes on all the fields used in the joins. I have done some research and tried some of the suggested remedies, such as the various SET options, without much success. The relevant fragments of the two statements are:
-- SET clause and join conditions of the first UPDATE:
FITokenCount = b.TokenCount
, FISignificantOPTotal = b.SignificantOPTotal
, FIInsignificantOPTotal = b.InsignificantOPTotal
n.InstitutionNumber = b.InstitutionNumber
n.DataItemUniqueID = b.DataItemUniqueID
-- SET clause and join conditions of the second UPDATE:
WLTokenCount = wn.TokenCount
, WLSignificantOPTotal = wn.SignificantOPTotal
, WLInsignificantOPTotal = wn.InsignificantOPTotal
n.WatchListEntryNameID = wn.WatchListEntryNameID
--and wn.Type = 1;
Any input will be greatly appreciated.
I have a .NET 4.0 workflow service deployed on IIS/AppFabric, but when I invoke it, its count stays at 0 in the dashboard. The Event Collection service is in the started state, there are no errors in the event log, and I see the ASStaging table has 108 records in the monitoring database, a number which increases when I make service calls. I have reset IIS several times and restarted the Event Collection service as well, but no luck. What could be the cause of the dashboard not updating?
I'm trying to access a Visual FoxPro database that is set up as a linked server in SQL Server 2005. This works fine. The problem, however, is when I try to select data from a large table (~40,000 records).
I'm using a simple query such as (table name elided):
SELECT Col1, Col2
WHERE Col1 = @param1
This takes a very long time (~1 minute 20 seconds). However, if I do not use parameters, and do this:
SELECT Col1, Col2
WHERE Col1 = 'SomeValue'
it takes less than a second. I would like a way to get this working using parameters, if possible.
I'm using the latest Visual FoxPro OLE DB driver (vfpoledb.dll) and .NET 2.0, if that makes a difference.
Any help would be appreciated!
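One workaround that sometimes helps when a linked-server provider handles parameterized remote queries poorly is to push the filter to the FoxPro side with OPENQUERY, so the WHERE clause is evaluated remotely instead of after pulling all 40,000 rows across. A sketch; FOXPRO and SomeTable are hypothetical names:

```sql
-- OPENQUERY sends its pass-through string to the remote engine verbatim,
-- so the filter runs on the FoxPro side. Note the string must be a
-- literal; to use a runtime parameter you would have to build the
-- statement with dynamic SQL (and quote the value carefully).
select Col1, Col2
from openquery(FOXPRO,
    'select Col1, Col2 from SomeTable where Col1 = ''SomeValue''');
```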
We have many tables with millions of records on the SQL Server side, and we want to migrate that data to Oracle tables. To set up a linked server between SQL Server and Oracle, we tried the providers 'OraOLEDB.Oracle' and 'Microsoft OLE DB Provider for Oracle', but without success: they failed while migrating big tables to the Oracle database, and there were many other errors with those providers. Finally, the combination of the 'MSDASQL' provider with a system DSN (named IA_SQL2ORA for reference) targeting Oracle worked for us, and we can now migrate data from SQL Server to Oracle.
The linked server syntax is: EXEC sp_addlinkedserver @server='IA_LINKEDSERVER', @srvproduct='IA_SQL2ORA', @provider='MSDASQL', @datasrc='IA_SQL2ORA'
But the problem is very poor performance: migrating 24 GB of data took 80 hours! Even a smaller SQL Server database takes a huge amount of time to migrate.
We have a DB script with simple code to migrate all our tables; here is a sample for one table:
Insert openquery(IA_LINKEDSERVER, 'select DBTime, DBUser, Description, SortOrder from ia_sys_status') select DBTime, DBUser, Description, SortOrder from [SQLSERVERDB].[dbo].[ia_sys_status]
During migration, when we observe the Oracle side, a lot of recursive calls happen. Our understanding is that on the Oracle side all the data is first kept in a buffer, and once all the data from the SQL Server table has been selected, Oracle inserts it into the respective Oracle table in one chunk.
We tried to insert records into the Oracle table in small chunks by setting the linked server's 'rpc' and 'rpc out' options to 'true', but there was not much improvement, as there are lots of calls between SQL Server and Oracle. Could someone suggest the best approach to improve the performance?
Please let me know if you require more information.
Thanks in advance.
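A chunked variant of the sample statement above may reduce the buffering on the Oracle side by keeping each remote insert small; a sketch, assuming SortOrder is a numeric column suitable for range slicing (if it is not, any numeric key would do):

```sql
-- Migrate ia_sys_status in ranges of 10,000 SortOrder values instead of
-- one giant buffered statement:
declare @lo int, @hi int, @max int;
select @lo = min(SortOrder), @max = max(SortOrder)
from [SQLSERVERDB].[dbo].[ia_sys_status];
while @lo <= @max
begin
    set @hi = @lo + 9999;
    insert openquery(IA_LINKEDSERVER,
        'select DBTime, DBUser, Description, SortOrder from ia_sys_status')
    select DBTime, DBUser, Description, SortOrder
    from [SQLSERVERDB].[dbo].[ia_sys_status]
    where SortOrder between @lo and @hi;
    set @lo = @hi + 1;
end;
```

For a one-off bulk migration, a dedicated tool (SSIS, or Oracle SQL*Loader fed from bcp exports) is usually much faster than any linked-server path, since it avoids the per-row round trips entirely.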
Hi, both of the following scripts return fewer than 1,300 rows from their final count.
How can I get more than 2,000 results? Thanks.
with Q as
(select 0 as std union all select 1 union all select 2 union all select 3 union all select 4 union all
select 5 ),tn as
(select ROW_NUMBER() over (Order by (select 1)) as tm from Q Q1,Q Q2,Q Q3,Q Q4)
select * from tn where tm <= 3000 and tm % 4 <> 1
with Q as
(select 0 as Count_1_5_9 union all select 1 union all select 2 union all select 3 union all select 4 union all
select 5 ),tn as
(select ROW_NUMBER() over (Order by (select 5)) as tr from Q Q1,Q Q2,Q Q3,Q Q4)
select * from tn where tr <= 3000 and tr % 4 = 1
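The shortfall is arithmetic: Q has 6 rows, and the cross join of four copies yields 6^4 = 1296 rows, so neither script can reach 1,300, let alone 3,000. Adding a fifth copy of Q gives 6^5 = 7776 rows, enough to satisfy the tm <= 3000 filter; a sketch based on the first script:

```sql
with Q as
(select 0 as std union all select 1 union all select 2 union all
 select 3 union all select 4 union all select 5),
tn as
(select ROW_NUMBER() over (order by (select 1)) as tm
 from Q Q1, Q Q2, Q Q3, Q Q4, Q Q5)   -- 6^5 = 7776 rows
select * from tn where tm <= 3000 and tm % 4 <> 1
```

The same change (adding Q5 to the FROM list) fixes the second script as well.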
When I try to export a SharePoint view from the SharePoint site to a spreadsheet, sometimes I am able to export the whole data set, and sometimes I get only column headers but no data or records.
I have an InfoPath form (2007) published on the SharePoint site, and I am using MOSS 2007 and Excel 2003.
I have not been able to find the cause of this Export to Spreadsheet behavior.
Is there a buffer-size issue with IIS, or a session timeout?
I am not sure, because when the records do make it to the spreadsheet, the file size is just 1.72 MB.
Hitesh Duggal
I have a ListBox whose ItemsSource is a List<string>. Whenever it is updated, ListBox.ItemsSource shows that there are elements in it, but they are not visible. If I start the app again, it shows all the elements. What could be the reason?
In our production environment we have SQL Server 2005 SP3 with the latest security update (build 9.0.4273), and currently a maximum of 150 users.
We are now going to add about 800 more users; kindly advise which parameters we have to increase at the SQL Server level.
The current configuration is below; please advise which values I should change:
Windows memory = 12 GB, number of logical processors = 8, max memory = 8 GB, min memory = 512 MB, awe enabled = 1,
cost threshold for parallelism = 5, max degree of parallelism = 4, max worker threads = 0,
remote access = 1, remote admin connections = 1, remote login timeout (s) = 20, user connections = 0, user options = 0
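With max worker threads = 0, SQL Server 2005 sizes the thread pool automatically based on CPU count and architecture (288 workers for a 32-bit instance with 8 logical CPUs, which the awe enabled = 1 setting suggests this is), so that value rarely needs changing; memory settings matter more as connections grow. To see what is actually in effect:

```sql
-- Automatic worker-pool size currently in use:
select max_workers_count from sys.dm_os_sys_info;

-- Configured value (0 = automatic sizing):
exec sp_configure 'show advanced options', 1;
reconfigure;
exec sp_configure 'max worker threads';
```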
I have a SQL Server 2005 view which runs very, very slowly: it takes 5 minutes. But when I rip the query out of the view and run it on its own, it takes barely 5-6 seconds.
Can someone please advise me on how to check what's going wrong with the view?
I recently upgraded to Office 2010. My database was originally developed in Access 2000. The database is saved on a server to which I have access, but no programming privileges. Now my database is running super slowly. Example: I have the switchboard open in design view, I change the border style, and I switch to form view to see the result. Form view comes up instantly, no problem. However, whenever I change back to design view to make any additional changes, I must sit and wait.
I never know how long I will have to wait; we're talking several minutes here. I only started having this problem when I switched to 2010. Any suggestions?
Even working from a copy on my local computer is slow.
What is the maximum size a database can grow to using SQL Server 2008 Express with Advanced Services? I currently have one that is 204.44 MB with 0 space free, and growth is set to Auto. How can I increase the database size in this version? I would like to allow it to be at least 400 MB. May I simply increase the initial size to 400 MB, and then increase the log file size to 2 MB (currently set to 1)? Autogrowth is enabled, set to grow automatically by 10 percent.
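For reference, the database size limit in SQL Server 2008 Express (including the Advanced Services edition) is 4 GB of data per database, with log files not counted toward the limit, so growing to 400 MB is well within bounds. Setting the sizes directly should work; a sketch, assuming the database and its default logical file names are MyDb and MyDb_log:

```sql
alter database MyDb modify file (name = MyDb,     size = 400MB);
alter database MyDb modify file (name = MyDb_log, size = 2MB);
```

With autogrowth at 10 percent this is optional, but pre-sizing avoids repeated small growth pauses.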
I have a BizTalk solution where I really care about latency, which is why I have been reading about BizTalk performance.
The following page recommends:
Disable hyper-threading on BizTalk Server and SQL Server computers
I don't understand why I need to do that. If BizTalk has no way of knowing how many physical processors there are in my server, why do I need to lose 20% to 30% of my processing power (according to the documentation itself at the URL above)?
This is really strange!
I am running BizTalk 2009 on a Windows Server 2008 guest in Virtual PC 2007.
The system seems to be running extremely slowly.
Processor: Intel(R) Core(TM) i5 CPU M540 @ 2.53 GHz
I have allocated 2 GB to the VPC.
Could someone please tell me how to improve the performance of the developer VPC?
I am using SQL Server 2005 to store binary data in multiple columns, but when the application tries to insert data into the database, the following error is thrown, saying the row size cannot be more than 8,060 bytes. Can anyone tell me how we can resolve this issue, either by increasing the row size in SQL Server or by some other alternative solution?
Cannot create a row of size 29879 which is greater than the allowable maximum of 8060. The statement has been terminated.
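The 8,060-byte limit applies to the in-row data of a single row and cannot be raised; the usual fix is to store large binary values as varbinary(max), which SQL Server 2005 moves off-row automatically when the row would otherwise overflow. A sketch with a hypothetical table:

```sql
create table dbo.Attachments
(
    AttachmentID int identity(1,1) primary key,
    FileName     nvarchar(260)  not null,
    Content      varbinary(max) not null  -- up to 2 GB, stored off-row as needed
);
```

Several fixed-size binary(n)/varbinary(n) columns that together exceed 8,060 bytes will always hit this error; consolidating them into max-typed columns avoids it.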
A server running a windows service (.Net 3.5 exe with a few associated class libraries) crashed last night. After the server came back up, the service (which makes a few web service calls and db accesses per work item) was running really slowly (each work item usually takes about 10 seconds, but it now is taking up to 20 minutes). There were absolutely no exceptions being thrown, db/ws timeouts... basically nothing out of the ordinary.
In the end, I replaced one of the dlls with a new version (which included some more tracing information), and that somehow fixed the problem. I then rolled back to the original version, and the problem was still fixed (surprisingly).
The best explanation I have is that .NET keeps some form of persistent assembly-level cache, and this cache (or the security policy cache, etc.) got corrupted by the server crash, which meant that for all subsequent processing .NET was constantly having to validate checksums, try to download other versions of DLLs from the web, or do something equally time-consuming. I figure that by overwriting one of the original DLLs, then replacing the new one with the old one again, I somehow forced a recalculation or purging of some low-level cache within the .NET Framework, which corrected the issue.
As background info, the service simply polls a web service every minute (a system.threading.timer object), and then for any work items it can find, these are given to the application's ThreadPool to serve.
This service has been running fine for 6 months, and only started acting strangely after the server crash. All the databases and other components seem fine, and no other applications in the network have any errors or issues at present.
Note: while this service does reference other assemblies, the one that I replaced is a C# class library written specifically for the service, and it is not registered in the GAC. It just resides in the service's bin directory.
Any ideas?!