Before you start coding, there are some general recommendations to make things faster:
- Is your data on a network drive? If so, make a local copy first.
- What is the data format? It is best to load the data into a file geodatabase.
- Do you have attribute indexes defined on the fields you join on?
- Obviously, the computer specs also influence the speed you will obtain.
30 minutes for 10,000 records (both feature class and table) is very slow; it should be possible to do this a lot faster.
Have a look at "Example 2 - Transfer of Multiple Field Values between Feature Classes where there is a 1:1 Match between Field Sets" within the section on "Using a Python Dictionary Built using a da SearchCursor to Replace a Join Connecting Two Feature Classes". That seems to be exactly what you are looking for.
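The core idea of that example is: one read-only pass over the source table builds a Python dictionary keyed on the join field, then one pass over the target feature class fills in the values by dictionary lookup, avoiding a table join entirely. A minimal sketch of that pattern is below; plain lists of tuples stand in for the cursors so the lookup logic is clear, and all field names and values are hypothetical. In ArcGIS the first loop would be an `arcpy.da.SearchCursor` and the second an `arcpy.da.UpdateCursor` calling `cur.updateRow(row)`.

```python
# Step 1: build a lookup keyed on the join field. In arcpy this would be:
#   with arcpy.da.SearchCursor(src_table, ["KEY", "VAL1", "VAL2"]) as cur:
#       lookup = {row[0]: row[1:] for row in cur}
source_rows = [
    ("A1", 10, "foo"),
    ("A2", 20, "bar"),
]
lookup = {row[0]: row[1:] for row in source_rows}

# Step 2: a single pass over the target rows transfers the values.
# In arcpy this loop would be an arcpy.da.UpdateCursor on the feature
# class, with cur.updateRow(row) after assigning the matched values.
target_rows = [
    ["A1", None, None],
    ["A2", None, None],
    ["A3", None, None],  # no match in the source: left untouched
]
for row in target_rows:
    match = lookup.get(row[0])  # O(1) lookup instead of a join
    if match is not None:
        row[1], row[2] = match
```

Because the dictionary lookup is constant time, the whole transfer is two linear passes over the data, which is why it tends to beat a join for this kind of 1:1 field transfer.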
If you run into problems, please post back the code you are using and a sample of your data, and we will have a look.