silver83
New member
- Joined
- Aug 2, 2005
- Messages
- 3
Hi,
I have a DataGrid bound to a table's DefaultView.
I am using the DefaultView.RowFilter property to filter the items shown in the DataGrid.
I manually build the filter string in the format "event_st_date > dd-mm-yyyy AND event_st_date < dd-mm-yyyy" and assign it to the RowFilter.
The weird thing is that on some computers it only works with a dd-mm-yyyy format, and on others it only works with mm-dd-yyyy.
I'm guessing it's a framework revision issue, but every machine I checked runs the .NET 1.1 Framework.
My only solution so far is to try one format, check the DefaultView.Count property for 0 rows, and in that case try the other format.
I'm guessing this is a bad habit, because once the database is big enough there will be cases where a filter in the wrong format still yields (wrong) results.
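For reference, here is a sketch of how the same filter could be built without depending on the machine's regional date settings. Only the column name event_st_date comes from my code; the method and variable names are just illustrative. The idea is that DataColumn filter expressions accept date literals wrapped in # delimiters, which are parsed with the invariant culture rather than the machine's locale:

```csharp
using System;
using System.Data;
using System.Globalization;

class FilterExample
{
    static string BuildDateFilter(DateTime from, DateTime to)
    {
        // #...# date literals in DataColumn expressions are parsed with
        // the invariant culture, regardless of the machine's regional
        // settings - so format them explicitly instead of relying on
        // DateTime.ToString()'s current-culture default.
        return string.Format(CultureInfo.InvariantCulture,
            "event_st_date > #{0:MM/dd/yyyy}# AND event_st_date < #{1:MM/dd/yyyy}#",
            from, to);
    }

    static void Main()
    {
        DataTable table = new DataTable();
        table.Columns.Add("event_st_date", typeof(DateTime));
        table.Rows.Add(new object[] { new DateTime(2005, 8, 15) });

        table.DefaultView.RowFilter =
            BuildDateFilter(new DateTime(2005, 8, 1), new DateTime(2005, 9, 1));
        Console.WriteLine(table.DefaultView.Count); // the one row in range
    }
}
```

This is what I understand the situation to be, anyway; I haven't verified it across all the machines.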
Is there a smarter way? Does anyone know the reason for this?
Thanks , Yoni.