Thread Problem

29/11/2006 - 18:39 by JC
Hi people,

Please, I need your help.

This code starts a thread fine, but the thread does not stop when I close it later.

Thanks...

// Fields referenced by the snippet (declared elsewhere in the original class):
private TcpListener _listener;
private Thread _ServerThread;
private bool _ThreadRun = true;

private void RunServer(int aPortNumber)
{
    // Listen on all local interfaces on the given port.
    _listener = new TcpListener(IPAddress.Any, aPortNumber);
    _listener.Start();

    // Run the accept loop on its own thread.
    _ServerThread = new Thread(delegate()
    {
        AcceptClients();
    });
    _ServerThread.Start();
}

public void Close()
{
    // Ask the accept loop to stop.
    _ThreadRun = false;
}

public void AcceptClients()
{
    while (_ThreadRun)
    {
        // AcceptTcpClient() blocks until a client connects, so the
        // _ThreadRun flag is only re-checked after each connection.
        using (TcpClient client = _listener.AcceptTcpClient())
        {
            if (client.Connected)
            {
                NetworkStream stream = client.GetStream();
                byte[] data = new byte[1024];
                stream.Read(data, 0, data.Length);
                string request = Encoding.ASCII.GetString(data);

                // ...
            }
        }
    }
}


#11 JC
29/11/2006 - 22:42
Thanks, everybody.
Dave Sexton: your point is reasonable. I'll take it.
Thanks
('_')

"ThunderMusic" wrote in message
news:uoEt0b$
yes, exception handling is a must to handle exceptional situations, but
the process should never be based on exception handling as you are
advising here... and yes, there is harm because exception handling is very
costly resource-wise, so if you can avoid it, avoid it...

I really think you should read up on asynchronous programming, it would
really help... I've done this kind of processing many times using
BeginAcceptTcpClient(...) without any problem when stopping my apps... I
don't have sample code at hand, but I know it worked fine... follow the
link I provided and you will most likely find what you need...

I hope it helps

ThunderMusic



"Dave Sexton" [remove.this]online.com> wrote in message
news:%23XbvSu%
Hi,

I've always learned that relying on exceptions as part of the normal process
is bad practice when there is another way to go



If the code is going to work reliably then exception handling is a must,
so there is no harm in simply returning in the case of an exception that
can be handled.

And in this case, there is another way to go: BeginAcceptTcpClient(...)
Just call _listener.BeginAcceptTcpClient(...) and everything will be
fine... ;)



<snip>

BeginAcceptTcpClient will not help here.

The loop still has to block somehow, and if you call _listener.Stop() an
exception will still be thrown even though Begin* is called.

IAsyncResult result = _listener.BeginAcceptTcpClient(null, null);

// now, we're just blocking here instead of there ^
using (TcpClient client = _listener.EndAcceptTcpClient(result))
{
    // ...
}

You could have a sub-loop that checks an EventWaitHandle and a variable
such as _ThreadRun, as in the OP, but the code will become more complex
with no real gain. And, exception handling code should still be
implemented anyway.

Dave Sexton
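A minimal sketch of the approach described above, reusing the field names from the
original post: Close() stops the listener to unblock the thread sitting in the accept
call, and the accept loop simply returns when the resulting exception is caught. The
exact exception types caught here are an assumption, not something spelled out in the
thread.

// Sketch only: assumes the fields from the original post
// (_listener, _ServerThread, _ThreadRun).
public void Close()
{
    _ThreadRun = false;

    // Stopping the listener unblocks the thread waiting in AcceptTcpClient().
    _listener.Stop();

    // Wait for the accept thread to finish.
    _ServerThread.Join();
}

public void AcceptClients()
{
    while (_ThreadRun)
    {
        TcpClient client;
        try
        {
            client = _listener.AcceptTcpClient();
        }
        catch (SocketException)
        {
            // Thrown when Close() calls _listener.Stop(); just return.
            return;
        }
        catch (ObjectDisposedException)
        {
            // May also be thrown once the listener has been stopped (assumption).
            return;
        }

        using (client)
        {
            // ... handle the client as in the original code ...
        }
    }
}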







#12 Jon Skeet [C# MVP]
29/11/2006 - 23:42
ThunderMusic wrote:
yes, exception handling is a must to handle exceptional situations, but the
process should never be based on exception handling as you are advising
here... and yes there is harm because exception handling is very costly
resource-wise, so if you can avoid it, avoid it...



Exception handling is not "very costly" - the performance implications
are almost never a good reason to avoid it. If a situation is
exceptional, an exception is appropriate. Exceptions shouldn't be used
without careful consideration, but for design reasons much more than
performance.

Just out of interest, how expensive do you think exceptions actually
are? Do you think you'd notice a significant difference in performance
if your application threw a thousand exceptions per second? I don't. An
application which threw a thousand exceptions per second would be very
worrying in terms of *design*, but the performance implications of that
are minimal when running outside the debugger.

See http://www.pobox.com/~skeet/csharp/exceptions.html for more on this
(including actual figures).

Jon Skeet -
http://www.pobox.com/~skeet Blog: http://www.msmvps.com/jon.skeet
If replying to the group, please do not mail me too
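
A rough sketch of a throw/catch timing loop along the lines of the figures quoted in
this thread; it is not the article's actual benchmark, and the iteration count,
exception type, and output format are illustrative only.

using System;
using System.Diagnostics;

class ThrowCatchBenchmark
{
    static void Main()
    {
        // Arbitrary iteration count; large enough to give a stable rate.
        const int iterations = 5000000;

        Stopwatch watch = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
        {
            try
            {
                // The work being measured is just throwing and catching.
                throw new ApplicationException("benchmark");
            }
            catch (ApplicationException)
            {
                // Swallow it; only the throw/catch cost matters here.
            }
        }
        watch.Stop();

        Console.WriteLine("Total time taken: {0}", watch.Elapsed);
        Console.WriteLine("Exceptions per millisecond: {0}",
            iterations / watch.ElapsedMilliseconds);
    }
}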
#13 Dave Sexton
30/11/2006 - 02:17
Hi Jon,

Great article.

You mentioned .NET 2.0 Beta 2 in the article, so I tested your code on my
desktop with .NET 2.0 (current, with 3.0 installed as well):

3.4 GHz Xeon w/hyperthreading enabled (1)

( No debugger attached )

Total time taken: 00:04:44.0312500
Exceptions per millisecond: 17

Still remarkably fast, contrary to popular belief, but I noticed something
peculiar: processor usage was steady at ~50% the entire time I ran the
program (~50% per virtual processor). I tried 4 times, once in release
configuration, and the results were consistent between 16 and 17 exceptions
per millisecond. 2 of the tests were executed while running background
applications that are somewhat processor-intensive and highly
memory-consuming and 2 weren't, but that didn't seem to have any effect on
the outcome.

Any thoughts?

Dave Sexton

"Jon Skeet [C# MVP]" wrote in message
news:
<snip>
#14 Jon Skeet [C# MVP]
30/11/2006 - 08:37
Dave Sexton [remove.this]online.com> wrote:
<snip>

Total time taken: 00:04:44.0312500
Exceptions per millisecond: 17

<snip>

Any thoughts?



Interesting. I've just rerun the test myself, on the same laptop as
before, but this time with Vista (so .NET 2.0 and 3.0) - I can "only"
throw 21 exceptions per millisecond.

That's still quite a lot, but it's about 6 times slower than it was in
1.1! Definitely time to update the article - it means that throwing
1000 exceptions per second (the example I used before) *would* have a
somewhat significant performance hit. My overall point is still valid
though, I believe - the design issues which are suggested by an
application throwing lots of exceptions are more important than the
performance implications.

Jon Skeet -
http://www.pobox.com/~skeet Blog: http://www.msmvps.com/jon.skeet
If replying to the group, please do not mail me too
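
Back-of-the-envelope arithmetic behind that "somewhat significant" remark, using only
the per-millisecond figures quoted in this thread; the derived percentages are
estimates, not measurements.

using System;

class ExceptionCostEstimate
{
    static void Main()
    {
        // Figures from this thread: ~21 exceptions/ms on .NET 2.0 (Vista laptop),
        // and roughly 6x that rate (~126/ms) back on 1.1.
        double perMsNet20 = 21.0;
        double perMsNet11 = 21.0 * 6.0;

        // At 1000 exceptions per second, the fraction of one CPU spent throwing:
        double throwsPerSecond = 1000.0;
        double loadNet20 = throwsPerSecond / (perMsNet20 * 1000.0); // ~4.8%
        double loadNet11 = throwsPerSecond / (perMsNet11 * 1000.0); // ~0.8%

        Console.WriteLine(".NET 2.0: {0:P1} of one CPU", loadNet20);
        Console.WriteLine(".NET 1.1: {0:P1} of one CPU", loadNet11);
    }
}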
#15 Dave Sexton
30/11/2006 - 12:00
Hi Jon,

<snip>

Any thoughts?



Interesting. I've just rerun the test myself, on the same laptop as
before, but this time with Vista (so .NET 2.0 and 3.0) - I can "only"
throw 21 exceptions per millisecond.

<snip>



I agree. My opinion hasn't really changed in light of these new results.
An application throwing 1000 exceptions per second would still lead me to
believe that there is some sort of design issue or bug that needs to be
dealt with before considering any performance implications that it may have
on the system, but avoiding exceptions purely out of performance concerns
would be overly cautious.

As for the tests:

I tried NGen on it before one of the runs, just for fun, but the results were
no different.

Good idea to try it on Vista. I tried it myself and saw a slight (very
slight) performance increase but with some strange behavior in CPU usage.

I've got a second HDD in my computer so the remaining hardware, such as CPU,
was the same for these tests:

Windows Vista (TM) Beta 2
Evaluation copy: Build 5384
(Clean Install)

Total time taken: 00:04:25.8552000
Exceptions per millisecond: 18

(I had over 1 GB of RAM free during each of these tests and on the previous
XP tests, in case that matters at all.)

On a side note:

Just like the 4 tests that I ran on XP SP2, the average CPU usage on the 4
Vista tests was consistent at ~50%.

However, CPU usage was split quite differently between the two virtual
processors on the Vista tests. (The 4 runs on XP were consistent at ~50/50
in terms of average % CPU usage.)

(Remember, this is one hyperthreaded processor and both XP and Vista used
the same exact hardware except for HDDs)

Vista Beta 2 test results as viewed in the Task Manager Performance tab (4
distinct runs):

1. Sporadic, with other processes running.
(I then waited for the CPU to idle before running any further tests.)

2. ~80/20, but with spikes every 30 seconds. Second CPU resembled an EKG.
3. Consistent at ~80/20.
4. Perfect square wave at ~60/40 and ~40/60, switching at about every 45
seconds.

The last 3 tests were run without any other GUI running. I wasn't even
moving the mouse :)

Dave Sexton