
the maximum deflection is

y = √(Ft1² + Fr1²)·L³ / (48·E·I)

where Ft1 — the tangential force on the worm (N);

Fr1 — the radial force on the worm (N);

E — the modulus of elasticity (MPa);

I — the moment of inertia of the dangerous cross-section of the worm (mm⁴);

L — the distance between the worm bearings (mm), L = 0.9·m·u·z1.
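As a quick numerical illustration of this stiffness check, the deflection can be computed directly; the force values, shaft diameter, and bearing span below are hypothetical, not taken from the paper:

```python
import math

def worm_deflection(Ft1, Fr1, E, I, L):
    """Maximum deflection of the worm shaft, modeled as a simply
    supported beam with the resultant gear force at mid-span:
    y = sqrt(Ft1^2 + Fr1^2) * L^3 / (48 * E * I)."""
    return math.sqrt(Ft1**2 + Fr1**2) * L**3 / (48 * E * I)

# Hypothetical example values (illustration only):
Ft1 = 1200.0            # tangential force on the worm, N
Fr1 = 450.0             # radial force on the worm, N
E = 2.06e5              # modulus of elasticity of steel, MPa (N/mm^2)
d = 40.0                # assumed shaft diameter at the dangerous section, mm
I = math.pi * d**4 / 64 # moment of inertia of a solid circular section, mm^4
L = 200.0               # assumed bearing span, mm

y = worm_deflection(Ft1, Fr1, E, I, L)
print(f"maximum deflection y = {y:.5f} mm")
```

In an actual design check, y would then be compared against the allowable deflection of the worm shaft.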

IV. FUZZY OPTIMIZATION MATHEMATICAL MODEL OF TRACTION MECHANISM

The key of this method is deciding the optimal level value. Several factors, such as factor class, factor fuzziness, and the differing influence of the factors on the optimal level values, were considered, and the method of second-class comprehensive evaluation based on the optimal level cut set was used. Thus the optimal level value λ* of every fuzzy constraint can be obtained, namely λ* = 0.71. The fuzzy optimization problem is thereby converted into a usual optimization problem.
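A minimal sketch of what this conversion amounts to, under the common tolerance (level-cut) interpretation of fuzzy constraints — the stress bound and tolerance below are hypothetical; only λ* = 0.71 comes from the text:

```python
def crisp_bound(b, d, lam):
    """Convert a fuzzy constraint g(x) <~ b with tolerance d into the
    crisp constraint g(x) <= b + (1 - lam) * d at optimal level lam."""
    return b + (1.0 - lam) * d

lam_star = 0.71      # optimal level value from the paper
sigma_allow = 200.0  # hypothetical allowable stress, MPa
tol = 20.0           # hypothetical fuzzy tolerance on the bound, MPa

bound = crisp_bound(sigma_allow, tol, lam_star)
print(f"crisp allowable stress: {bound:.2f} MPa")
```

Each fuzzy constraint is relaxed by the unused fraction (1 − λ*) of its tolerance, after which an ordinary optimizer can be applied.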

V. TRAINING RELATION COEFFICIENT BY NEURAL NETWORKS

Neural networks are composed of simple elements operating in parallel, inspired by biological nervous systems. As in nature, the network function is determined largely by the connections between elements. We can train a neural network to perform a particular function by adjusting the values of the connections (weights) between elements. Commonly, neural networks are adjusted, or trained, so that a particular input leads to a specific target output, based on a comparison of the output and the target, until the network output matches the target. Some points on the relation curve between the teeth number Z2 and the profile factor Yf of the worm gear are selected as training sample data; the Fast Back Propagation algorithm is adopted to train a feed-forward network, and the weights and biases of the network are updated. The network is then simulated with the functions of the Neural Network Toolbox in MATLAB. The program is as follows:

Z2 = 0:10:90;
YF = [2.58, 2.5176, 2.4566, 2.3972, 2.3392, 2.2825, 2.2273, 2.1734, 2.1208, 2.0695];
n1 = 5;
[W1, b1, W2, b2] = initff(Z2, n1, 'tansig', YF, 'purelin');
fpd = 100; mne = 20000; sse = 0.001; lr = 0.01;
tp = [fpd, mne, sse, lr];
[W1, b1, W2, b2, te, tr] = trainbpx(W1, b1, 'tansig', W2, b2, 'purelin', Z2, YF, tp)
y = simuff(Z2, W1, b1, 'tansig', W2, b2, 'purelin')
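The initff/trainbpx/simuff routines belong to an early release of the Neural Network Toolbox. As a self-contained sketch of the same idea — a 1-5-1 tansig/purelin feed-forward network trained by back-propagation on the Z2–Yf samples — a plain-NumPy version might look like this; the learning rate, epoch count, and normalization are illustrative choices, not from the paper:

```python
import numpy as np

# Training data from the paper: teeth number Z2 vs. profile factor YF.
Z2 = np.arange(0, 100, 10, dtype=float).reshape(-1, 1)
YF = np.array([2.58, 2.5176, 2.4566, 2.3972, 2.3392,
               2.2825, 2.2273, 2.1734, 2.1208, 2.0695]).reshape(-1, 1)

# Normalize inputs and targets to ease training.
x = Z2 / 90.0
t = (YF - YF.mean()) / YF.std()

rng = np.random.default_rng(0)
n_hidden = 5  # same hidden-layer size as the MATLAB program (n1 = 5)
W1 = rng.normal(0, 1, (1, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 1, (n_hidden, 1)); b2 = np.zeros(1)

lr = 0.05
for epoch in range(20000):
    # Forward pass: tansig (tanh) hidden layer, purelin (linear) output.
    h = np.tanh(x @ W1 + b1)
    yhat = h @ W2 + b2
    err = yhat - t
    # Backward pass: gradients of the mean-squared error.
    gW2 = h.T @ err / len(x); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h**2)
    gW1 = x.T @ dh / len(x); gb1 = dh.mean(0)
    # Gradient-descent weight and bias updates.
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

mse = float(((np.tanh(x @ W1 + b1) @ W2 + b2 - t) ** 2).mean())
print(f"final training MSE (normalized targets): {mse:.6f}")
```

The trained network can then be evaluated at intermediate Z2 values, playing the role of simuff in the MATLAB listing.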

VI. SOLVING USUAL OPTIMIZATION MATHEMATICAL MODEL BY GENETIC ALGORITHM TOOLBOX

One key to successfully solving many types of optimization problems is choosing the method that best suits the problem. The Genetic Algorithm and Direct Search Toolbox is a collection of functions that extend the capabilities of the Optimization Toolbox and the MATLAB numeric computing environment. The toolbox includes routines for solving optimization problems using genetic algorithms and direct search. These algorithms enable you to solve a variety of optimization problems that lie outside the scope of the standard Optimization Toolbox. First, the fitness function with penalty terms is built by a penalty strategy of the addition type and programmed in the MATLAB language, and the above neural network program fitting the profile factor of the worm gear teeth is recalled; then the nonlinear constraint function is programmed and the solver functions of the Genetic Algorithm Toolbox are adopted. The program is as follows:

options = gaoptimset('PopulationSize', 20);
options = gaoptimset(options, 'Generations', 100);
options = gaoptimset(options, 'CrossoverFraction', 0.95, 'MigrationFraction', 0.01);
options = gaoptimset(options, 'SelectionFcn', @selectiontournament, ...
    'CrossoverFcn', @crossoverscattered, 'MutationFcn', @mutationgaussian);
nvars = 3; lb = [1; 2; 10]; ub = [2; 8; 150];
[x, Fval, exitFlag, Output] = ga(@fitnessfun, nvars, [], [], [], [], lb, ub, @yueshufun, options)
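Since fitnessfun and yueshufun are not reproduced in the text, the following Python sketch only mirrors the structure of the run — tournament selection, scattered crossover, Gaussian mutation, and an additive penalty — with the same variable bounds and the lead-angle constraint tan γ = m·z1/d1 from the model. The objective below is a stand-in, not the paper's actual rim-mass fitness function:

```python
import math
import random

random.seed(1)

LB = [1.0, 2.0, 10.0]   # lower bounds on [z1, m, d1], as in the MATLAB program
UB = [2.0, 8.0, 150.0]  # upper bounds

def objective(x):
    """Stand-in objective of (z1, m, d1); the paper's rim-mass
    function is not reproduced here."""
    z1, m, d1 = x
    return m * m * d1

def penalty(x):
    """Additive penalty for the lead-angle constraint 3 <= gamma <= 8 deg,
    with tan(gamma) = m*z1/d1 (from the paper's constraint list)."""
    z1, m, d1 = x
    gamma = math.degrees(math.atan(m * z1 / d1))
    p = 0.0
    if gamma < 3.0: p += (3.0 - gamma) ** 2
    if gamma > 8.0: p += (gamma - 8.0) ** 2
    return 1e4 * p

def fitness(x):
    return objective(x) + penalty(x)

def tournament(pop, k=2):
    # Tournament selection: best of k random individuals.
    return min(random.sample(pop, k), key=fitness)

def scattered_crossover(a, b):
    # Scattered crossover: each gene comes from either parent.
    return [ai if random.random() < 0.5 else bi for ai, bi in zip(a, b)]

def gaussian_mutation(x, scale=0.1):
    # Gaussian mutation, clamped to the variable bounds.
    return [min(UB[i], max(LB[i], xi + random.gauss(0, scale * (UB[i] - LB[i]))))
            for i, xi in enumerate(x)]

pop = [[random.uniform(LB[i], UB[i]) for i in range(3)] for _ in range(20)]
for gen in range(100):
    nxt = [min(pop, key=fitness)]      # elitism: carry over the best
    while len(nxt) < len(pop):
        child = scattered_crossover(tournament(pop), tournament(pop))
        if random.random() < 0.05:     # occasional mutation
            child = gaussian_mutation(child)
        nxt.append(child)
    pop = nxt

best = min(pop, key=fitness)
print("best x:", [round(v, 4) for v in best], "fitness:", round(fitness(best), 4))
```

With the real fitness and constraint functions substituted in, this loop plays the same role as the ga call above.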

After 108 function evaluations and 326 iterations, the final output of the above program is: x1 = 1.0102, x2 = 4.8889, x3 = 78.2222, f(X) = 1090628.

VII. CONCLUSION

This paper explored the methods available in the Genetic Algorithm and Neural Network Toolboxes. Compared with the standard optimization algorithm (f(X) = 1269257.5), the objective function optimum found by the genetic algorithm is about 16.37% less than the former. Therefore the genetic algorithm is an effective solver for nonsmooth problems. Additionally, the genetic algorithm can be combined with other solvers, such as fuzzy logic and neural networks, to find a more accurate solution efficiently.

TABLE I

OUTPUT OF STANDARD OPTIMIZATION AND GENETIC ALGORITHM

APPENDIX B

GENETIC ALGORITHM OPTIMIZATION OF CRANE TRANSMISSION WITH NEURAL NETWORKS

Abstract: A fuzzy optimization mathematical model is established for the design of the crane transmission. The method of second-class comprehensive evaluation based on the optimal level cut set is used, so that the optimal level value of every fuzzy constraint can be obtained and the fuzzy optimization is converted into a usual optimization. The fast back-propagation algorithm is adopted to train feed-forward neural networks to fit the relation coefficient. Then the fitness function with penalty terms is built by a penalty strategy, the neural network program is recalled, and the solver functions of the Genetic Algorithm Toolbox in MATLAB are adopted to solve the optimization mathematical model.

Keywords: crane mechanism; genetic algorithm optimization; neural networks.

FUZZY OPTIMIZATION MATHEMATICAL MODEL OF TRACTION MECHANISM

An involute cylindrical worm drive is adopted for the crane transmission, with the following main parameters: rated power Pe = 1.5 kW, output speed 28.4 r/min, output torque T2 = 2295.87 N·m, gear ratio u = 49.3, and working load factor k = 1.05. The worm is machined from heat-treated 45 steel, and the rim of the worm gear is made of ZQA19-4.

A. Specifying the objective function

ΪÁ˽ÚÊ¡ÓÐÉ«½ðÊôµÄ³ÝÂÖ¹ÚµÄÂÝÐý³ÝÂÖ£¬ÄÇÄ¿±êº¯Êý½«Ó¦Ö¸¶¨ÄÇÄÇ´óÁ¿µÄ³ÝÂÖ¹ÚµÄÂÝÐý³ÝÂÖÔÚÇ£Òý»ú¹¹Ïò×îСµÄ°´ÕÕͼ1Çãб£¬d0¡¢ di2,b·Ö±ðÊÇÍ⾶¡¢ÄÚ¾¶ºÍÂÝÐý³ÝÂֵijÝÃæ¿í¹Ú£¬Òò´ËÄÇÊÇ´óÁ¿µÄÑÀ³Ý¹Ú£»

From

so the objective function is

where m is the gear module, d1 the pitch diameter of the worm, and z1 the number of worm threads (starts).

ͼ1.ÎÐÂÖ´«¶¯×°ÖÃͼ

B. Selecting the design variables

According to the objective function above, z1, m, and d1 should be selected as the design variables, in brief: X = [x1, x2, x3]^T = [z1, m, d1]^T.

C. Establishing the fuzzy constraints

ÈÏΪ¦µÖµµÄËæ»úÌØÐÔÉè¼Æ²ÎÊýºÍһЩÒòËØË­µÄ¼ÛÖµºÜ²»¶¨µÄ±ÈÈ縺ºÉÐÔÖʺͲÄÁÏÆ·ÖÊ¡¢ÄÇÄ£ºýÔ¼ÊøÊǽ¨Á¢¡¢°üÀ¨ÄÇÐÔÖʺͱ߽çÔ¼ÊøÔÚÄÚ¡£ 1£©¼«Ï޵ĿªÊ¼µÄÂÝÐýµÄ³ÝÊý£ºz1=1~2;£» 2£©¼«Ï޵ijÝÂÖµÄÄ£Êý:2¡Üm¡Ü8£»

3) Limits on the lead angle of the worm, to guarantee the efficiency of the worm drive: 3° ≤ γ ≤ 8°, tan γ = m·z1/d1;

4) Contact strength constraint of the worm gear:

ÄDzÄÁϵ¯ÐÔÒòËØ¡¢

¦ÒhÄǽӴ¥Ó¦Á¦µÄÂÝÐý³ÝÂÖ£»

[¦Òh]-ÄÇÄ£ºýµÄ¦µÖµÄÇÈÝÐí½Ó´¥Ó¦Á¦µÄÂÝÐý³ÝÂÖ¡£ 5£©Ô¼ÊøµÄÑÀ³ÝÁºÇ¿¶ÈµÄÂÝÐý³ÝÂÖ£º

ÄǺáÁºÇ¿µ÷µÄÂֳݣ»

[¦Òf]-Ä£ºýµÄ¦µÖµÄÇÈÝÐíÍäÇúÓ¦Á¦µÄÂÝÐý³ÝÂÖÑÀ³Ý£» Yf-ÄÇÂÖÀªÒòËØÒòΪÂÝÐý³ÝÂÖÑÀ³Ý¡£

6) Stiffness constraint of the worm: the worm shaft is supported between two bearings; if it deflects too much, the teeth will not mesh properly, so the maximum deflection must be limited.
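As a consistency check, the genetic algorithm output reported above (x1 = 1.0102, x2 = 4.8889, x3 = 78.2222, interpreted as z1, m, d1) can be substituted into the lead-angle constraint of item 3); the helper function here is illustrative:

```python
import math

def lead_angle_deg(m, z1, d1):
    """Lead angle of the worm: tan(gamma) = m * z1 / d1."""
    return math.degrees(math.atan(m * z1 / d1))

# Raw GA optimum; in practice z1 would be rounded to an integer number
# of worm threads.
gamma = lead_angle_deg(m=4.8889, z1=1.0102, d1=78.2222)
print(f"lead angle gamma = {gamma:.3f} degrees")
assert 3.0 <= gamma <= 8.0  # constraint 3) is satisfied
```

The optimum thus sits near the lower lead-angle bound, consistent with the constrained minimization of the rim mass.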