Author | Message | Time |
---|---|---|
Myndfyr | Hey -- I've started working on a new project. Since the .NET platform has absolutely no good forum system, I've gotten permission from the YaBB team to put together a .NET version of YaBB. I have a bunch of fun ideas for the .NET version, particularly taking advantage of the whole web services paradigm. If anybody is interested in working on this with me, send me an e-mail -- robert@paveza.net. I'm programming the main application in C#, but the only parts of the project I reserve the right to write myself are the data abstraction layer (almost done) and the XML data provider (work has begun). Other data providers, such as Access, MySQL, and SQL Server, could be written in any language you want, provided they behave the way the DAL dictates (a rough sketch of that contract is below the thread). :) Let me know if anyone is interested. I'd love to get some people working on this. | February 20, 2004, 2:41 AM |
Myndfyr | For those who are interested, I am about 85% done with the data abstraction layer and about 40% done with the XML data provider. My goal for the first *preview* is to have everything built in and working except for Polls, which will come a bit later. I still need to download the Advanced IM mod for YaBB 1.3 to see how that works, because that's about 10% of the DAL that I haven't done yet. Right now, the plan is to load the entire data set into memory by instantiating the data provider through the DAL (the DAL will be stored in Application["DAL"] or something of the sort, and the DAL will hold a reference to the data provider). Every thirty seconds and at Application_End, the data set will be persisted to disk (the timing is up to the data provider developer -- thirty seconds is my plan for the XML DP). All of the objects (instances of IForum, IUser, IBoard, etc.) will be kept in Hashtables and will be associated with each other -- for example: [code] interface IThread { IUser[] Subscribers { get; } /* ... */ } interface IUser { string Email { get; } /* ... */ } [/code] So whenever a thread is updated, I can loop through each user in its subscriber list, get their e-mail addresses, and send them an e-mail saying the thread has been updated (sketched below the thread). I imagine this is an approach similar to what you guys are doing with object-oriented Perl, but since I don't know object-oriented Perl, it's a guess. ;) Anyway, the theory is by keeping it in memory I'll be able to cut down on processing time and disk access, at the cost of a little bit of memory. | February 24, 2004, 3:12 AM |
Adron | [quote author=Myndfyre link=board=22;threadid=5373;start=0#msg45691 date=1077592327] Anyway, the theory is by keeping it in memory I'll be able to cut down on processing time and disk access, at the cost of a little bit of memory. [/quote] Are you sure about this? I don't know how large the db for this forum is, but I wouldn't be surprised if it's 100 megs or more. Sounds like using up quite a lot of memory. | February 24, 2004, 10:36 PM |
Grok | Not with ADO.NET. Because ADO.NET for the most part uses disconnected datasets, it manages for you when a connection to the backend is needed, so you can concentrate on your application code. With legacy ADO, you could use disconnected recordsets, but not by default, and they required extra coding. On top of that, if you're using SQL Server (and maybe Oracle does this too), your connections are pooled automatically. As long as the credentials of an existing pooled connection match, it can be allocated to your request, which saves on connection time -- one major source of visible "lag" in client systems. If no pooled connection matches the credentials of the request, you still need a new connection and must wait for it to be created and allocated. By tuning the pooling characteristics for your application's needs, you can significantly speed up your database applications; SQL Server is pretty good at it already with the defaults, but you might be able to do better by understanding your application's usage profile. Oh, I only spoke about speed -- you were talking about memory usage. By using client-side keyset cursors you can read only the row details you actually need, and only fetch updates to specific rows that have changed on the server since you last disconnected (again, largely managed by ADO.NET). Depending on your cursor types and table schemas, you can trade more trips to the database from your web server for smaller memory needs (a sketch of this pattern is below the thread). Unless your database is on a separate server from your web server, I don't see any advantage, not even a slight one, to opening the entire data set in memory; the RPC calls to your database should be lightning-fast compared to the Internet speeds of the client's browser. | February 24, 2004, 11:37 PM |
peofeoknight | There are some very good ASP.NET forums out there if you look hard enough, but nothing like Invision or phpBB... or YaBB. There is the forum on www.asp.net, but I am not too fond of it -- it's not open source, so you can't mod it at all. | February 25, 2004, 12:54 AM |
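
To make the pluggable data-provider arrangement from Myndfyr's first post more concrete, here is a minimal C# sketch of what such a provider contract could look like. The names (IDataProvider, XmlDataProvider, Load, Save, dataPath) are hypothetical illustrations, not the project's actual DAL.

[code]
// Hypothetical sketch: the DAL publishes one interface, and each storage
// backend (XML, Access, MySQL, SQL Server, ...) ships its own implementation.
public interface IDataProvider
{
    // Load the complete forum data set from the backing store into memory.
    void Load();

    // Persist the in-memory data set back to the backing store.
    void Save();
}

// Example backend: the XML provider mentioned above. Forum pages would talk
// only to IDataProvider, never to a concrete provider class directly.
public class XmlDataProvider : IDataProvider
{
    private string dataPath;

    public XmlDataProvider(string dataPath)
    {
        this.dataPath = dataPath;
    }

    public void Load()
    {
        // Parse the XML files under dataPath and build the in-memory objects.
    }

    public void Save()
    {
        // Write the in-memory objects back out as XML under dataPath.
    }
}
[/code]

A database-backed provider (Access, MySQL, SQL Server) would implement the same members, which is what would let the storage back end be swapped without touching the rest of the forum code.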
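The subscriber-notification loop from Myndfyr's second post could look roughly like this, using the IThread/IUser interfaces quoted there plus the .NET 1.x System.Web.Mail API. ThreadWatcher, the sender address, and the message text are made up for illustration.

[code]
using System.Web.Mail;

// The two interfaces from the post above, repeated so the sketch stands alone.
public interface IUser
{
    string Email { get; }
}

public interface IThread
{
    IUser[] Subscribers { get; }
}

// Rough sketch: when a thread is updated, walk its subscriber list and
// e-mail each user. Assumes the default (local) SMTP server unless
// SmtpMail.SmtpServer is set elsewhere.
public class ThreadWatcher
{
    public void NotifySubscribers(IThread thread)
    {
        foreach (IUser user in thread.Subscribers)
        {
            MailMessage message = new MailMessage();
            message.From = "forum@example.com";   // placeholder sender address
            message.To = user.Email;
            message.Subject = "A thread you subscribe to has been updated";
            message.Body = "One of the threads you are watching has new posts.";

            SmtpMail.Send(message);
        }
    }
}
[/code]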
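For the disconnected-dataset pattern Grok describes, the typical ADO.NET shape is to let a SqlDataAdapter open a pooled connection just long enough to fill a DataSet, then work against the in-memory copy. The connection string, table, and column names below are placeholders.

[code]
using System.Data;
using System.Data.SqlClient;

public class ThreadRepository
{
    // Placeholder connection string; ADO.NET pools connections that share the
    // same string and credentials automatically.
    private const string ConnectionString =
        "Server=(local);Database=Forum;Integrated Security=SSPI;";

    public DataSet GetThreads(int boardId)
    {
        SqlConnection connection = new SqlConnection(ConnectionString);
        SqlDataAdapter adapter = new SqlDataAdapter(
            "SELECT ThreadID, Title, LastPostDate FROM Threads WHERE BoardID = @BoardID",
            connection);
        adapter.SelectCommand.Parameters.Add("@BoardID", SqlDbType.Int).Value = boardId;

        DataSet threads = new DataSet();

        // Fill opens the connection, runs the query, and closes the connection
        // again, returning it to the pool; the DataSet is a disconnected copy
        // that pages can read or bind to without holding a connection open.
        adapter.Fill(threads, "Threads");

        return threads;
    }
}
[/code]

Changes made later in the DataSet can be pushed back the same way with SqlDataAdapter.Update, again holding a pooled connection only for the duration of the call.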