
Getting the URL of Items in Subfolders

I need to find the complete URL of items in a list's subfolders. I have 10 levels of folders in my list and need to keep track of the URL of each item in the list that gets deleted. In the browser the URL looks ugly, as it appends something after AllItems.aspx?Rootfolder=%2fen%2dCA%2fForEmployees%2fDocuments%2fFolder1&FolderCTID=&View=%7b42C8B4C0%2d1C33%2d419E%2dABC9%2dBAC0B7788844%7d

Is it possible to retrieve this URL programmatically?
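For the deletion-tracking part of the question, one commonly suggested approach is an item event receiver that records the URL before the item disappears. A minimal, untested sketch (where and how you log the URL is up to you):

using Microsoft.SharePoint;

public class UrlLoggingReceiver : SPItemEventReceiver
{
    // ItemDeleting fires before the delete completes, while the URL is still available.
    public override void ItemDeleting(SPItemEventProperties properties)
    {
        string fullUrl = properties.WebUrl + "/" + properties.ListItem.Url;
        // Persist fullUrl somewhere durable (audit list, database, log file).
    }
}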


1 Answer Found


Answer 1


You can try something like this:

using (SPSite site = new SPSite(SPContext.Current.Site.Url))
using (SPWeb web = site.OpenWeb())
{
    SPList list = web.Lists["Shared Documents"];
    SPListItemCollection items = list.Items;

    foreach (SPListItem item in items)
    {
        // Url is relative to the containing web, e.g. "Shared Documents/Folder1/file.docx"
        string url = item.Url;
    }
}






Kshitij Bishnoi



Actually, I am retrieving items in a list.

How do I get an SPListItem's server-relative URL in a SharePoint list? I tried item.Url, but it is not giving the exact URL.

Any help will be appreciated.
Thanks & Regards, Neerubee

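Not an authoritative answer, but a common workaround: item.Url is relative to the containing web, so you can prepend the web's URL yourself. A minimal sketch (for document items, item.File.ServerRelativeUrl is an alternative):

using (SPSite site = new SPSite(SPContext.Current.Site.Url))
using (SPWeb web = site.OpenWeb())
{
    SPList list = web.Lists["Shared Documents"];
    foreach (SPListItem item in list.Items)
    {
        // Server-relative, e.g. "/sites/mysite/Shared Documents/Folder1/file.docx"
        string serverRelative = web.ServerRelativeUrl.TrimEnd('/') + "/" + item.Url;

        // Absolute, e.g. "http://server/sites/mysite/Shared Documents/Folder1/file.docx"
        string absolute = web.Url.TrimEnd('/') + "/" + item.Url;
    }
}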


My SharePoint lists have both out-of-the-box and custom XSLT views. For all the list items within these views, I use JavaScript to hide the default ECB menu options (like View Item, Edit Item, Manage Permissions, etc.), and I created a menu option that opens the list item in an InfoPath form.

But when someone clicks the title column value directly, rather than selecting the only option from the title column drop-down, it opens the default display form (DispForm.aspx).

So how can I disable that and make it navigate to my custom URL?

I tried using supporting files too, but there it will not show .xsn files.

Thank you.
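One server-side avenue worth trying (a sketch, not a verified fix for the title-click behaviour): point the list's content type at a custom display form, which the item links generally honour. This assumes content type management is enabled on the list and that the custom page already exists:

SPList list = web.Lists["MyList"];                       // assumed list name
SPContentType ct = list.ContentTypes[0];
ct.DisplayFormUrl = "_layouts/MyCustomDisplayForm.aspx"; // assumed page location
ct.Update();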


Hello - I have two web parts connected; one provider and one consumer. When I select a list item in the provider web part, the consumer web part displays details about the list item. When looking at the URL, it is:


If I open a new window and paste the URL, the id (#101) isn't selected because the JavaScript never executes.  When I hover over list item ID #101, this is the code that executes: 

javascript:SelectField('{0958f7b5-186d-494a-8e6a-ba483a608ff9}','101');return false;

I need to have the above URL in an email that the email recipient can click. 

Do I need to add javascript to my default.aspx page that will execute once the page loads? 

Is there an easier way to accomplish this?

Thank you very much!
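One possibility, sketched under assumptions (the query-string parameter name "SelectedID" and the placement in the provider web part are made up): have the page read an ID from the URL on load and replay the same selection script the hover link runs.

// In the provider web part's code-behind:
protected override void OnLoad(EventArgs e)
{
    base.OnLoad(e);

    // "SelectedID" is an assumed parameter carried in the emailed URL.
    string id = Page.Request.QueryString["SelectedID"];
    if (!string.IsNullOrEmpty(id))
    {
        // Replay the same call the hover link would have made.
        string script = "SelectField('{0958f7b5-186d-494a-8e6a-ba483a608ff9}','" + id + "');";
        Page.ClientScript.RegisterStartupScript(this.GetType(), "reselect", script, true);
    }
}

The emailed link would then look something like http://yourserver/default.aspx?SelectedID=101 (hypothetical).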



I have a drop-down list for departments (engineering, bio, chemistry, physics, etc.). When somebody chooses a department from the drop-down list, it displays the opening and closing times (hours) in a DetailsView control.

I have a separate homepage for each of those departments, and I want to link to the hours for each department from the departmental homepage.

When I choose the department from the DDL, it's the same URL every time.

How can I solve the problem, so that I can point to the respective hours from the respective departments?

Thanks in advance.
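A common pattern, sketched here with assumed control and parameter names: carry the department in the query string (e.g. Hours.aspx?dept=Engineering), link to that URL from each departmental homepage, and preselect the drop-down on load.

protected void Page_Load(object sender, EventArgs e)
{
    if (!IsPostBack)
    {
        string dept = Request.QueryString["dept"]; // assumed parameter name
        if (!string.IsNullOrEmpty(dept) &&
            DepartmentDropDown.Items.FindByValue(dept) != null) // assumed control ID
        {
            DepartmentDropDown.SelectedValue = dept;
            BindHours(dept); // assumed helper that fills the DetailsView
        }
    }
}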





Is it possible to create a list item custom action with a URL that contains {fieldname} instead of {ItemId}, {ItemUrl}, {ListId}, or {SiteUrl}? For example: http://MYSPSITEURL/SitePages/custompage.aspx?Field={fieldname}
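Out of the box only the listed tokens are expanded, so one workaround (a sketch; the field name and page names are assumptions) is a small redirector page that receives {ListId} and {ItemId}, reads the field value, and forwards:

// Custom action URL (hypothetical):
//   http://MYSPSITEURL/SitePages/Redirect.aspx?List={ListId}&ID={ItemId}
protected void Page_Load(object sender, EventArgs e)
{
    Guid listId = new Guid(Request.QueryString["List"]);
    int itemId = int.Parse(Request.QueryString["ID"]);

    SPListItem item = SPContext.Current.Web.Lists[listId].GetItemById(itemId);
    string value = Convert.ToString(item["MyField"]); // assumed field name

    Response.Redirect("custompage.aspx?Field=" + HttpUtility.UrlEncode(value));
}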



I have created a timer job which goes through my site collection, looking in all web sites and all document-based libraries for items that are missing specific metadata fields. For every item that it finds, I want to add a new item to a task list, with a description of the missing fields and a URL link to the edit form for that record.

I have added a URL field to my task list for the link to the edit form, but I am wondering about the best way to form the URL to the record's edit form.

Given that the items will be all over the place, I gather I will have to build the URL, which shouldn't be too hard as I have all the info. I have the item ID, which I can include as a parameter, but I am worried about the possibility of someone changing the default edit form name for the list. Therefore I need to know if there is an easy way to get this from the list.

Ideally, I would also like to set the Source parameter to my current list, so that when the user finishes with the edit form, they come back to my task list.

Anyone got ideas as to how to achieve this?

Many thanks
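One way to stay robust against a renamed edit form is to ask the list for its edit form rather than hard-coding EditForm.aspx. A sketch (the task list URL is an assumption):

SPList list = item.ParentList;

// Web-relative URL of whatever the list's edit form actually is.
string editForm = list.Forms[PAGETYPE.PAGE_EDITFORM].Url;

string taskListUrl = web.Url + "/Lists/Tasks/AllItems.aspx"; // assumed task list view
string link = web.Url.TrimEnd('/') + "/" + editForm +
              "?ID=" + item.ID +
              "&Source=" + HttpUtility.UrlEncode(taskListUrl);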


I have a SharePoint 2010 environment that uses https://portal.<something>.com.

Exporting a list to Excel uses the wrong URL.

When I go to a list, select the list to show the Office ribbon, and then select Export to Excel, Excel shows the error "Excel cannot connect to the SharePoint List".

When I take a look at the IQY file that is created, it contains the wrong URL: http://portal.<something>.com instead of https://portal.<something>.com.

Changing the URL in the IQY file manually to https://portal.<something>.com and then opening it in Excel works fine; list data is exported to Excel then.

Alternate access mapping is configured to use https://portal.<something>.com as the default, and export to Access works normally. I already deleted the http://portal.<something>.com alternate access mapping once to see if this would fix the problem. No fix, not even after an IISRESET or rebooting the server.

Where or how is the Export to Excel IQY file created, and where is it possible to fix the URL it puts in the IQY file?




Hello Everyone

We've just upgraded our client Team System from 2008 to 2010, plus the Power Tools. Now when we attempt to right-click on the Work Item tab, choose "Copy Full Path", and paste into an Outlook email, the copied path will not open. Here is the version info. Does anyone know if this is a known bug? It previously worked with 2008.

Here is what the "full path" looks like: http://teamsystem/wi.aspx?id=22153


Thanks in advance for any help. Lisa

Microsoft Visual Studio 2010
Version 10.0.30319.1 RTMRel
Microsoft .NET Framework
Version 4.0.30319 RTMRel

Installed Version: Standard

Microsoft Visual Studio 2010 Team Explorer   01011-532-2002361-70409
Microsoft Visual Studio 2010 Team Explorer

Microsoft Visual Web Developer 2010   01011-532-2002361-70409
Microsoft Visual Web Developer 2010

Microsoft Team Foundation Server 2010 Power Tools   3.0.30423.0
Power Tools that extend the Team Foundation Server integration with Visual Studio.

Microsoft Visual Studio Process Editor   1.0
Process Editor for Microsoft Visual Studio Team Foundation Server


I am in a big problem which I believe is very simple for you. Hope someone will help me out.

I have a site in 3 environments: DEV, Staging, and Production. I have a document library with InfoPath form templates enabled, and the user has 3 InfoPath templates under the New button on the library toolbar. The user selects a template and saves the InfoPath form to the document library. A workflow on the library triggers when a new item is added and fires on the item.

In the workflow I am writing a description and the InfoPath URL. In Staging, when I check the URL, it is sent as below:

The above is the format I actually need, so the InfoPath form opens in the browser instead of the client application.

But when I check the URL in the mail from any other environment, like DEV or Production, it is sent as below:

But this sometimes opens the item in the InfoPath client, and sometimes prompts with the Open, Save, and Save As options in the browser.

All environments have the same site and configuration. I have checked Central Administration: InfoPath Forms Services is configured and the browser-enabled option is on. The InfoPath forms also have the browser compatibility options enabled, and the document library is set to display documents as web pages. If all environments have the same files everywhere, why are different URLs being sent in the emails?

Please help me out. My final goal is to get the first URL format, mentioned above for the Staging environment, so the InfoPath form opens in the browser instead of the InfoPath client, as not everyone has InfoPath installed.

Please let me know if you need any more information.
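One way to take the environment difference out of the equation (a sketch, assuming the workflow code can build the link itself): construct the Forms Services URL explicitly, which forces browser rendering:

// FormServer.aspx renders the form in the browser; OpenIn=Browser makes that explicit.
string formUrl = web.Url +
    "/_layouts/FormServer.aspx?XmlLocation=" +
    HttpUtility.UrlEncode(item.File.ServerRelativeUrl) +
    "&OpenIn=Browser";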



I'm using SharePoint's Lists web service (the GetListItems method) to get some field data for a list's items. Now, in addition to the information that I retrieve for an item, I need its direct URL, so I can give the user an optional link to click in order to navigate smoothly to that item on the SharePoint site.

Here is the code that gets the item's data:


XmlDocument xmlDoc = new XmlDocument();

XmlNode ndQuery = xmlDoc.CreateNode(XmlNodeType.Element, "Query", "");
XmlNode ndViewFields = xmlDoc.CreateNode(XmlNodeType.Element, "ViewFields", "");
XmlNode ndQueryOptions = xmlDoc.CreateNode(XmlNodeType.Element, "QueryOptions", "");

var cc = new CredentialCache();
// The auth scheme was missing in the original snippet; "NTLM" is an assumption.
cc.Add(new Uri(url), "NTLM", new NetworkCredential(username, password, domain));
SPListWebService.Credentials = cc;
SPListWebService.Url = url + "/_vti_bin/Lists.asmx";

// Query options node (the original snippet was truncated here).
ndQueryOptions.InnerXml =
    "<IncludeMandatoryColumns>FALSE</IncludeMandatoryColumns>";

// Query node
foreach (DataRow row in view.Rows)
{
    string v = row[ffdmn].ToString();
    ndQuery.InnerXml = string.Format("<Where><Eq><FieldRef Name='{0}'/>" +
        "<Value Type='Text'>{1}</Value></Eq></Where>", sfdmname, v);

    XmlNode ndListItems = SPListWebService.GetListItems(listName, null, ndQuery,
        ndViewFields, null, ndQueryOptions, null);

    foreach (XmlNode node in ndListItems.ChildNodes)
        if (node.Name == "rs:data")
            foreach (XmlNode innerNode in node.ChildNodes)
                if (innerNode.Name == "z:row")
                {
                    // I do some logic here to get the targeted fields
                }
}
Any help?
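One thing that may help (a sketch, untested against the code above): request the EncodedAbsUrl field explicitly via the ViewFields node, then read it off each z:row - it carries the item's full, absolute URL.

// Ask the service to return the absolute URL alongside the other fields.
ndViewFields.InnerXml = "<FieldRef Name='EncodedAbsUrl' /><FieldRef Name='Title' />";

// ...then, inside the z:row handling above:
string itemUrl = innerNode.Attributes["ows_EncodedAbsUrl"].Value;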


I want to get the full URL for a list item, not the relative URL from item.Url. How?

Is it possible to create a new list item record via a URL using parameters?




Hi all,

I'm writing a build script and I have an issue with the CL compiler.

I have a structure in my project where under the 'src' folder there are other folders, each one representing a different namespace.

Now I'd like to pass the CL compiler a path like "src/**/*.cpp", that is, all the CPP files in any subfolder of the "src" folder.

If I try to give this to the CL compiler I get an "Invalid argument" error.

How can I manage to compile all the .cpp files in any of the subfolders?
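CL doesn't expand recursive wildcards itself, so one workaround (hedged; adjust paths and compiler options to your build) is to let the shell do the recursion, for example in a batch file:

rem Compile every .cpp under src\, letting FOR /R handle the recursion.
for /R src %%f in (*.cpp) do cl /c "%%f"

From an interactive prompt, use single percent signs (%f) instead.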






I have created a Team Project in TFS 2010. Under that I have created 2 folders, Src and Test. I have also created a VS 2010 solution in the project, a sibling to the Src and Test folders. Under Src I will create the projects that make up my application. Under Test I will create projects containing my NUnit tests.

I have added a check-in policy to the Team Project that requires Code Analysis to run before check-in. I only want this to apply to projects under my Src folder, not those under Test. How do I do such a thing?

Thanks, Beezler



Recently, I found I cannot get the code in a subfolder of my project from VSS. I tried to use the command line; it also did not work. Does anyone know how I can solve the problem?




I have created a user control for search in MOSS. I have created pages for my site in the "Pages" library. In this library I have one subfolder, "Application Submission", which also contains some pages. Now when I search for pages, the pages stored inside "Application Submission" are not shown. Can anyone suggest how to include pages inside a subfolder in custom search results?

Below is the code I am using for searching.


mySite = new SPSite(SiteName);
FullTextSqlQuery ftQuery = new FullTextSqlQuery(mySite);
ftQuery.QueryText = "SELECT Title, Description, Path, HitHighlightedSummary FROM Scope() " +
    "WHERE CONTAINS ('\"" + strQueryText + "\"') AND (\"scope\"='External Site')";
ftQuery.ResultTypes = ResultType.RelevantResults;
ftQuery.EnableStemming = true;
ftQuery.KeywordInclusion = KeywordInclusion.AllKeywords;
ResultTableCollection resultTbls = ftQuery.Execute();

SyncToy 1.4 used to crash, and SyncToy 2.0 got stuck at about 60-70% when attempting to sync a 'small' 30 GB folder.

OK, it's November 2009 and SyncToy 2.1 is released!

Let's test to see if it can handle copying data from my source folder to a destination. 

The main reason I would like to use SyncToy rather than Robocopy is to avoid unnecessarily recopying files and folders that you may have moved or renamed, which should provide a vast speed improvement over a simpler utility like Robocopy.
To summarise the results below for Synctoy 2.1 testing:

-> Good news is that Synctoy no longer crashes, and no longer gets stuck.

-> Bad news is that if you use SyncToy 2.1 and create a new folder pair for a source and destination folder that ALREADY EXIST (and are already perfectly synchronised, using something like Robocopy or perhaps SyncToy 1, 2, or 2.1 with a previous folder pair), then it completely fails to recognise subfolders that you subsequently move or rename, and it ends up copying them all over again, leaving BOTH in the destination.

The following tests are using the 'Echo' feature of Synctoy 2.1.
This PC is an Intel Quad core 2.5GHz with 3g ram running Vista Ultimate 32bit.

My test folder is 184g (242,616 files in 9096 folders)
I'm still using robocopy to handle my backup to external usb drive; once the drives are synced  if I do a second pass robocopy takes just 4.5 seconds to whizz through the 242,616 files and show that there's nothing left to copy.

Using Synctoy 2.1 I created a new folder pair for this pair of folders using echo with standard options, and clicked Preview.

On the first pass I did think it was stuck at 60% but it carried on without problem taking 1 hour and 22 minutes. That's quite reasonable considering synctoy 2.0 got completely stuck on a folder a sixth of the size.  It has now created a database representing the file structure which should allow it to identify folders or files that were renamed, ultimately avoiding hours and hours of unnecessary copying and network traffic. It looks promising.

Strangely, out of the 242,616 files it did identify 15 files from two folders that needed updating,  even though I ensured the folders were completely in sync before I started.

I just used WINDIFF to check the source and target subfolders that contain these particular 15 files. Yup - the files in this folder are definitely identical, so I don't know why synctoy has identified that these files need to be overwritten. (The source and destination drives are both local and both NTFS).

OK, I let it run anyway. Despite showing me it had 15 files to copy from D: to F:, it actually only did 6 overwrite operations and no other operations (and 485,146 files did not require action, which is presumably around 242 thousand files on D and a similar number on F).

I windiffed the two folders again and all the files still match.

OK, let's see how long synctoy 2.1 takes now  on the second pass... there should be nothing to sync.  Wow... it took 2 mins 40 seconds - that's not bad at all !   Did another preview - now 2 mins 5 seconds.

So it's thumbs up for Synctoy 2.1 in terms of speed of working out its tasks, not getting stuck and not crashing.

Right, here's the big test for Synctoy 2.1:
Inside this 184g folder is a subfolder that contains 11.7gb and over 56,500 files including 1799 folders.   I'll now rename that subfolder on the source drive and then attempt to sync again with Synctoy.    Other methods of syncing including the robocopy script with the /MIR switch  would delete the 11.7 gb from the destination and copy the 11.7gb all over again.   Synctoy should recognise the name change and just rename the destination folder taking no time at all, though I appreciate it might accomplish it by doing 56,500 separate move operations - (which should still be a lot quicker than copying the 56,500 files! )

I clicked Preview to kick off the echo operation again. It came up with a vast number of delete and new operations and only a few renames. The preview took 35 mins 29 seconds, and in total it found 76,738 actions to perform!

That's a big fail for Synctoy 2.1.  What's the point of spending well over an hour to make that database on the first pass if it's not going to detect a single folder rename?

I then let it run - but I eventually stopped it when it had done over 19,000 operations after about an hour. I manually moved the folders back that it was copying unnecessarily and then deleted the 13186 files (2gig) that it had created so far in the destinations recycle bin.
OK, I can't trust synctoy now to get this right because I've messed about manually with the destination folder.  To get the destination perfectly identical to the source without deleting the destination and starting all over again, I used Robocopy with the /mir switch.  OK, the destination is now exactly the same as the source.

I then ran the preview once again on that large area but it now says there are 323 folders to delete, 6186 files to delete, 6186 new files and 323 new folders.   OK, actually, while doing my previous testing, I did do a few renames of other subfolders in both the source and the destination.  They do match perfectly, confirmed by robocopy, but of course synctoy has a database that reflects what it saw in the source folder the last time I ran it... so presumably it sees the changes I made and thinks it has a load of work to do, when in fact there is none to do. I'm not sure I like this idea of how synctoy checks against a database that it created last time - I think it really needs to check all of the destination files and folders each time.  It's really making the assumption that the destination is only ever changed by synctoy,  not allowing for manual copying or editing of files, nor for renaming of folders on anything but the original source.

If I could remember all of the folder names that I renamed and put them all back again, I'm sure Synctoy would then be happy. But the only way to sort this mess out now and avoid Synctoy attempting to perform all unnecessary moves and renames is to delete this folder pair, recreate it and run the preview again. I did this and after waiting for the hour and a half processing, confirmed it now shows 0 files to copy.

Let's test Synctoy 2.1 with something much much simpler.
I created a folder d:\test with 3 subfolders: a, b, and c. They all have files, but folder b has 2 further subfolders: fff, with 12 files and 2 further subfolders of its own, and ggg, in which there are 7 files.
Let's sync that to f:\test using echo.
The preview took under a second. The run took 14 seconds.
Ran it again - took under a second to show that there are zero files to copy.
Right,  I now manually rename folder b to t
The preview again takes under a second and shows 5 folders to delete, 67 renames and 5 folders to create and it ran this in 1.5 seconds.
Well, that worked perfectly... That said, it's a shame it doesn't realise it can just perform a single rename on the top-level folder, and instead does individual moves of every file within! But what went wrong with my large folder... is it just a matter of folder size that confuses SyncToy?

I'll try a smaller folder within the 140 gig structure. This folder contains 17 files, 164meg and includes one subfolder.  OK, it got that right....  just rename operations and create folders.
Let's try another folder -  194 files, 1.25 gig including 74 subfolders -  it got that right too apart from 2 files which it decided to delete and recopy.
Now try another folder - 902 files, 5.22 gig including 572 subfolders.  Now it's starting to go wrong.  It came up with 209 folders to delete and create, 332 files to delete, 571 files to rename (move), 332 smallish files to delete and recopy - (totalling 75 meg).  In this case that wouldn't take too long,  but why is it doing this?
Now a bigger folder -  6760 files , 15.99 gig including 2162 subfolders.  It came up with 739 folders to delete and recreate, 5775 files to rename (move), 1051 files to delete and re-copy totalling 103meg.  Again, interesting it is choosing small files to recopy. I checked out some of these files - they're not read only.  Perhaps I have a clue.... these files are all from a small set of subfolders.

I'll try setting up a new sync pair using one of these subfolders that it decided to delete/copy rather than rename/move. 
This folder structure is 1174 files, 80 MB, and contains 389 subfolders. I'll start off with the folder names being the same; the preview ran very quickly and showed nothing to change. OK, change the subfolder in question inside the source folder - YES - it again says these are new files. The path is very long... I'll subst a pair of drive letters for the source and destination a long way down the path so that the folders are no longer long. This'll tell us if it's something to do with the contents of the folders, or just a problem with path length. It still says 179 new files to copy.
I substed a pair of drive letters as close as possible to the folder in question and it still showed 179 new files to copy.
In case it's something to do with the files themselves, or their names, I'll copy them to a simple test folder d:\test - then I'll rename the single subfolder inside.  It still shows new files to copy.  This is now a simple folder with a single subfolder. Still the same problem.

Now let's just go back to creating a very small test folder with just 2 subfolders and 1 file, similar to the successful test I did before. Well, now this is causing trouble with SyncToy. I can't work out why it sometimes successfully renames/moves files, and other times wants to copy a new file (and sometimes doesn't delete the original, leaving 2 copies in the destination). [EDIT - I've since discovered this occurs only if the destination folder already exists when you create the SyncToy pair. It never goes wrong if you let SyncToy copy the entire source to an empty destination folder on the first run... but this means you can never delete and recreate the folder pair unless you delete the entire destination folder as well.]

I'll try one more - with my pics folder: 28,024 files, 62 GB, already robocopied to the destination folder. The preview operation successfully shows nothing to copy. I renamed a subfolder one level down, and another subfolder at level 2. It got the whole lot wrong; SyncToy wants to copy 1,608 files totalling 4.6 GB. If I completely close SyncToy, reload, and try again, the results are the same.

I'm afraid SyncToy 2.1 echo just doesn't work! I really was hoping that SyncToy 2.1 would be a winner. The fault may lie within SyncToy itself or within the Microsoft Sync Framework on which SyncToy is based - so you'd have to test other products based on the Sync Framework to see if the same problem exists in them.
Most people will assume that SyncToy works without problems, so I guess a lot of time is going to be wasted; I've just spent 6 hours trying to get it to work properly myself. I hope MS can come up with an update to SyncToy and/or the Sync Framework ASAP.

Tim from howeasyisthat.com and brisbanepc.com.au

[I've made a number of edits mainly to make this easier to read,
and to include a summary of the findings right at the top]


The editor automatically generates two subfolders,

../data/missions/arm and .../data tgl/arm

and fills those folders with entries.

Those entries are not helpful once game development is finished. How can the automatic generation of these folders and entries be eliminated?


I have to rewrite this post because the forums lost my previous message.

This is the issue:

In MOSS 2007, subscriptions work fine except with subfolders. If a user subscribes to a document library, he gets notified when an item is added, as expected.

If the user subscribes to a subfolder, he does not get notified when an item is added to the subfolder.

I have also checked what is stated in the following link:

The EventCache table behaviour is correct, but alerts are not received by users.

In the quoted article it is suggested to use FileMon (now Process Monitor). I captured the log, but it is not so easy to interpret.

Where can I investigate to solve this issue?


Davide Gatta

