Thread: Will this really work on a live account?
- 04-01-2012, 09:06 PM #16
By the way, there are threads on foreign sites that someone should sit down with at the earliest opportunity; there may well be an approach in them that could be turned into a successful expert advisor without us even realizing it.
For example, I found a topic on forecasting with neural networks that comes with ready-made C code, so they have saved us even the programming time. I have no real background in using networks for forecasting; all I have done is read an Arabic book on the equations of a three-layer network and translate them into C, bearing in mind that many other network types exist and I used the simplest one.
Here is an example of what I found:
Backpropagation Network
Time-Series Forecasting
Prediction of the Annual Number of Sunspots
This program implements the now classic multi-layer backpropagation network with bias terms and momentum. It is used to detect structure in time-series, which is presented to the network using a simple tapped delay-line memory. The program learns to predict future sunspot activity from historical data collected over the past three centuries. To avoid overfitting, the termination of the learning procedure is controlled by the so-called stopped training method.
Code:
Neural Networks Source Codes - The Backpropagation Network

The Backpropagation Network

This program is copyright (c) 1996 by the author. It is made available as is, and no warranty - about the program, its performance, or its conformity to any specification - is given or implied. It may be used, modified, and distributed freely for private and commercial purposes, as long as the original author is credited as part of the final work.

Backpropagation Network Simulator

/******************************************************************************
                 ====================================================
    Network:     Backpropagation Network with Bias Terms and Momentum
                 ====================================================
    Application: Time-Series Forecasting
                 Prediction of the Annual Number of Sunspots
    Author:      Karsten Kutza
    Date:        17.4.96
    Reference:   D.E. Rumelhart, G.E. Hinton, R.J. Williams
                 Learning Internal Representations by Error Propagation
                 in: D.E. Rumelhart, J.L. McClelland (Eds.)
                 Parallel Distributed Processing, Volume 1
                 MIT Press, Cambridge, MA, pp. 318-362, 1986
 ******************************************************************************/

/******************************************************************************
                          D E C L A R A T I O N S
 ******************************************************************************/

#include <stdlib.h>
#include <stdio.h>
#include <math.h>

typedef int           BOOL;
typedef int           INT;
typedef double        REAL;

#define FALSE         0
#define TRUE          1
#define NOT           !
#define AND           &&
#define OR            ||

#define MIN_REAL      -HUGE_VAL
#define MAX_REAL      +HUGE_VAL
#define MIN(x,y)      ((x)<(y) ? (x) : (y))
#define MAX(x,y)      ((x)>(y) ? (x) : (y))

#define LO            0.1
#define HI            0.9
#define BIAS          1

#define sqr(x)        ((x)*(x))

typedef struct {                     /* A LAYER OF A NET:                     */
        INT           Units;         /* - number of units in this layer       */
        REAL*         Output;        /* - output of ith unit                  */
        REAL*         Error;         /* - error term of ith unit              */
        REAL**        Weight;        /* - connection weights to ith unit      */
        REAL**        WeightSave;    /* - saved weights for stopped training  */
        REAL**        dWeight;       /* - last weight deltas for momentum     */
} LAYER;

typedef struct {                     /* A NET:                                */
        LAYER**       Layer;         /* - layers of this net                  */
        LAYER*        InputLayer;    /* - input layer                         */
        LAYER*        OutputLayer;   /* - output layer                        */
        REAL          Alpha;         /* - momentum factor                     */
        REAL          Eta;           /* - learning rate                       */
        REAL          Gain;          /* - gain of sigmoid function            */
        REAL          Error;         /* - total net error                     */
} NET;

/******************************************************************************
        R A N D O M S   D R A W N   F R O M   D I S T R I B U T I O N S
 ******************************************************************************/

void InitializeRandoms()
{
  srand(4711);
}

INT RandomEqualINT(INT Low, INT High)
{
  return rand() % (High-Low+1) + Low;
}

REAL RandomEqualREAL(REAL Low, REAL High)
{
  return ((REAL) rand() / RAND_MAX) * (High-Low) + Low;
}

/******************************************************************************
               A P P L I C A T I O N - S P E C I F I C   C O D E
 ******************************************************************************/

#define NUM_LAYERS    3
#define N             30
#define M             1
INT                   Units[NUM_LAYERS] = {N, 10, M};

#define FIRST_YEAR    1700
#define NUM_YEARS     280

#define TRAIN_LWB     (N)
#define TRAIN_UPB     (179)
#define TRAIN_YEARS   (TRAIN_UPB - TRAIN_LWB + 1)
#define TEST_LWB      (180)
#define TEST_UPB      (259)
#define TEST_YEARS    (TEST_UPB - TEST_LWB + 1)
#define EVAL_LWB      (260)
#define EVAL_UPB      (NUM_YEARS - 1)
#define EVAL_YEARS    (EVAL_UPB - EVAL_LWB + 1)

REAL                  Sunspots_[NUM_YEARS];
REAL                  Sunspots [NUM_YEARS] = {

                        0.0262,  0.0575,  0.0837,  0.1203,  0.1883,  0.3033,  0.1517,
                        0.1046,  0.0523,  0.0418,  0.0157,  0.0000,  0.0000,  0.0105,
                        0.0575,  0.1412,  0.2458,  0.3295,  0.3138,  0.2040,  0.1464,
                        0.1360,  0.1151,  0.0575,  0.1098,  0.2092,  0.4079,  0.6381,
                        0.5387,  0.3818,  0.2458,  0.1831,  0.0575,  0.0262,  0.0837,
                        0.1778,  0.3661,  0.4236,  0.5805,  0.5282,  0.3818,  0.2092,
                        0.1046,  0.0837,  0.0262,  0.0575,  0.1151,  0.2092,  0.3138,
                        0.4231,  0.4362,  0.2495,  0.2500,  0.1606,  0.0638,  0.0502,
                        0.0534,  0.1700,  0.2489,  0.2824,  0.3290,  0.4493,  0.3201,
                        0.2359,  0.1904,  0.1093,  0.0596,  0.1977,  0.3651,  0.5549,
                        0.5272,  0.4268,  0.3478,  0.1820,  0.1600,  0.0366,  0.1036,
                        0.4838,  0.8075,  0.6585,  0.4435,  0.3562,  0.2014,  0.1192,
                        0.0534,  0.1260,  0.4336,  0.6904,  0.6846,  0.6177,  0.4702,
                        0.3483,  0.3138,  0.2453,  0.2144,  0.1114,  0.0837,  0.0335,
                        0.0214,  0.0356,  0.0758,  0.1778,  0.2354,  0.2254,  0.2484,
                        0.2207,  0.1470,  0.0528,  0.0424,  0.0131,  0.0000,  0.0073,
                        0.0262,  0.0638,  0.0727,  0.1851,  0.2395,  0.2150,  0.1574,
                        0.1250,  0.0816,  0.0345,  0.0209,  0.0094,  0.0445,  0.0868,
                        0.1898,  0.2594,  0.3358,  0.3504,  0.3708,  0.2500,  0.1438,
                        0.0445,  0.0690,  0.2976,  0.6354,  0.7233,  0.5397,  0.4482,
                        0.3379,  0.1919,  0.1266,  0.0560,  0.0785,  0.2097,  0.3216,
                        0.5152,  0.6522,  0.5036,  0.3483,  0.3373,  0.2829,  0.2040,
                        0.1077,  0.0350,  0.0225,  0.1187,  0.2866,  0.4906,  0.5010,
                        0.4038,  0.3091,  0.2301,  0.2458,  0.1595,  0.0853,  0.0382,
                        0.1966,  0.3870,  0.7270,  0.5816,  0.5314,  0.3462,  0.2338,
                        0.0889,  0.0591,  0.0649,  0.0178,  0.0314,  0.1689,  0.2840,
                        0.3122,  0.3332,  0.3321,  0.2730,  0.1328,  0.0685,  0.0356,
                        0.0330,  0.0371,  0.1862,  0.3818,  0.4451,  0.4079,  0.3347,
                        0.2186,  0.1370,  0.1396,  0.0633,  0.0497,  0.0141,  0.0262,
                        0.1276,  0.2197,  0.3321,  0.2814,  0.3243,  0.2537,  0.2296,
                        0.0973,  0.0298,  0.0188,  0.0073,  0.0502,  0.2479,  0.2986,
                        0.5434,  0.4215,  0.3326,  0.1966,  0.1365,  0.0743,  0.0303,
                        0.0873,  0.2317,  0.3342,  0.3609,  0.4069,  0.3394,  0.1867,
                        0.1109,  0.0581,  0.0298,  0.0455,  0.1888,  0.4168,  0.5983,
                        0.5732,  0.4644,  0.3546,  0.2484,  0.1600,  0.0853,  0.0502,
                        0.1736,  0.4843,  0.7929,  0.7128,  0.7045,  0.4388,  0.3630,
                        0.1647,  0.0727,  0.0230,  0.1987,  0.7411,  0.9947,  0.9665,
                        0.8316,  0.5873,  0.2819,  0.1961,  0.1459,  0.0534,  0.0790,
                        0.2458,  0.4906,  0.5539,  0.5518,  0.5465,  0.3483,  0.3603,
                        0.1987,  0.1804,  0.0811,  0.0659,  0.1428,  0.4838,  0.8127
                      };

REAL                  Mean;
REAL                  TrainError;
REAL                  TrainErrorPredictingMean;
REAL                  TestError;
REAL                  TestErrorPredictingMean;

FILE*                 f;

void NormalizeSunspots()
{
  INT  Year;
  REAL Min, Max;

  Min = MAX_REAL;
  Max = MIN_REAL;
  for (Year=0; Year<NUM_YEARS; Year++) {
    Min = MIN(Min, Sunspots[Year]);
    Max = MAX(Max, Sunspots[Year]);
  }
  Mean = 0;
  for (Year=0; Year<NUM_YEARS; Year++) {
    Sunspots_[Year] =
    Sunspots [Year] = ((Sunspots[Year]-Min) / (Max-Min)) * (HI-LO) + LO;
    Mean += Sunspots[Year] / NUM_YEARS;
  }
}

void InitializeApplication(NET* Net)
{
  INT  Year, i;
  REAL Out, Err;

  Net->Alpha = 0.5;
  Net->Eta   = 0.05;
  Net->Gain  = 1;

  NormalizeSunspots();
  TrainErrorPredictingMean = 0;
  for (Year=TRAIN_LWB; Year<=TRAIN_UPB; Year++) {
    for (i=0; i<M; i++) {
      Out = Sunspots[Year+i];
      Err = Mean - Out;
      TrainErrorPredictingMean += 0.5 * sqr(Err);
    }
  }
  TestErrorPredictingMean = 0;
  for (Year=TEST_LWB; Year<=TEST_UPB; Year++) {
    for (i=0; i<M; i++) {
      Out = Sunspots[Year+i];
      Err = Mean - Out;
      TestErrorPredictingMean += 0.5 * sqr(Err);
    }
  }
  f = fopen("BPN.txt", "w");
}

void FinalizeApplication(NET* Net)
{
  fclose(f);
}

/******************************************************************************
                        I N I T I A L I Z A T I O N
 ******************************************************************************/

void GenerateNetwork(NET* Net)
{
  INT l,i;

  Net->Layer = (LAYER**) calloc(NUM_LAYERS, sizeof(LAYER*));
  for (l=0; l<NUM_LAYERS; l++) {
    Net->Layer[l] = (LAYER*) malloc(sizeof(LAYER));
    Net->Layer[l]->Units      = Units[l];
    Net->Layer[l]->Output     = (REAL*)  calloc(Units[l]+1, sizeof(REAL));
    Net->Layer[l]->Error      = (REAL*)  calloc(Units[l]+1, sizeof(REAL));
    Net->Layer[l]->Weight     = (REAL**) calloc(Units[l]+1, sizeof(REAL*));
    Net->Layer[l]->WeightSave = (REAL**) calloc(Units[l]+1, sizeof(REAL*));
    Net->Layer[l]->dWeight    = (REAL**) calloc(Units[l]+1, sizeof(REAL*));
    Net->Layer[l]->Output[0]  = BIAS;
    if (l != 0) {
      for (i=1; i<=Units[l]; i++) {
        Net->Layer[l]->Weight[i]     = (REAL*) calloc(Units[l-1]+1, sizeof(REAL));
        Net->Layer[l]->WeightSave[i] = (REAL*) calloc(Units[l-1]+1, sizeof(REAL));
        Net->Layer[l]->dWeight[i]    = (REAL*) calloc(Units[l-1]+1, sizeof(REAL));
      }
    }
  }
  Net->InputLayer  = Net->Layer[0];
  Net->OutputLayer = Net->Layer[NUM_LAYERS - 1];
  Net->Alpha       = 0.9;
  Net->Eta         = 0.25;
  Net->Gain        = 1;
}

void RandomWeights(NET* Net)
{
  INT l,i,j;

  for (l=1; l<NUM_LAYERS; l++) {
    for (i=1; i<=Net->Layer[l]->Units; i++) {
      for (j=0; j<=Net->Layer[l-1]->Units; j++) {
        Net->Layer[l]->Weight[i][j] = RandomEqualREAL(-0.5, 0.5);
      }
    }
  }
}

void SetInput(NET* Net, REAL* Input)
{
  INT i;

  for (i=1; i<=Net->InputLayer->Units; i++) {
    Net->InputLayer->Output[i] = Input[i-1];
  }
}

void GetOutput(NET* Net, REAL* Output)
{
  INT i;

  for (i=1; i<=Net->OutputLayer->Units; i++) {
    Output[i-1] = Net->OutputLayer->Output[i];
  }
}

/******************************************************************************
            S U P P O R T   F O R   S T O P P E D   T R A I N I N G
 ******************************************************************************/

void SaveWeights(NET* Net)
{
  INT l,i,j;

  for (l=1; l<NUM_LAYERS; l++) {
    for (i=1; i<=Net->Layer[l]->Units; i++) {
      for (j=0; j<=Net->Layer[l-1]->Units; j++) {
        Net->Layer[l]->WeightSave[i][j] = Net->Layer[l]->Weight[i][j];
      }
    }
  }
}

void RestoreWeights(NET* Net)
{
  INT l,i,j;

  for (l=1; l<NUM_LAYERS; l++) {
    for (i=1; i<=Net->Layer[l]->Units; i++) {
      for (j=0; j<=Net->Layer[l-1]->Units; j++) {
        Net->Layer[l]->Weight[i][j] = Net->Layer[l]->WeightSave[i][j];
      }
    }
  }
}

/******************************************************************************
                     P R O P A G A T I N G   S I G N A L S
 ******************************************************************************/

void PropagateLayer(NET* Net, LAYER* Lower, LAYER* Upper)
{
  INT  i,j;
  REAL Sum;

  for (i=1; i<=Upper->Units; i++) {
    Sum = 0;
    for (j=0; j<=Lower->Units; j++) {
      Sum += Upper->Weight[i][j] * Lower->Output[j];
    }
    Upper->Output[i] = 1 / (1 + exp(-Net->Gain * Sum));
  }
}

void PropagateNet(NET* Net)
{
  INT l;

  for (l=0; l<NUM_LAYERS-1; l++) {
    PropagateLayer(Net, Net->Layer[l], Net->Layer[l+1]);
  }
}

/******************************************************************************
                  B A C K P R O P A G A T I N G   E R R O R S
 ******************************************************************************/

void ComputeOutputError(NET* Net, REAL* Target)
{
  INT  i;
  REAL Out, Err;

  Net->Error = 0;
  for (i=1; i<=Net->OutputLayer->Units; i++) {
    Out = Net->OutputLayer->Output[i];
    Err = Target[i-1]-Out;
    Net->OutputLayer->Error[i] = Net->Gain * Out * (1-Out) * Err;
    Net->Error += 0.5 * sqr(Err);
  }
}

void BackpropagateLayer(NET* Net, LAYER* Upper, LAYER* Lower)
{
  INT  i,j;
  REAL Out, Err;

  for (i=1; i<=Lower->Units; i++) {
    Out = Lower->Output[i];
    Err = 0;
    for (j=1; j<=Upper->Units; j++) {
      Err += Upper->Weight[j][i] * Upper->Error[j];
    }
    Lower->Error[i] = Net->Gain * Out * (1-Out) * Err;
  }
}

void BackpropagateNet(NET* Net)
{
  INT l;

  for (l=NUM_LAYERS-1; l>1; l--) {
    BackpropagateLayer(Net, Net->Layer[l], Net->Layer[l-1]);
  }
}

void AdjustWeights(NET* Net)
{
  INT  l,i,j;
  REAL Out, Err, dWeight;

  for (l=1; l<NUM_LAYERS; l++) {
    for (i=1; i<=Net->Layer[l]->Units; i++) {
      for (j=0; j<=Net->Layer[l-1]->Units; j++) {
        Out     = Net->Layer[l-1]->Output[j];
        Err     = Net->Layer[l]->Error[i];
        dWeight = Net->Layer[l]->dWeight[i][j];
        Net->Layer[l]->Weight[i][j] += Net->Eta * Err * Out + Net->Alpha * dWeight;
        Net->Layer[l]->dWeight[i][j] = Net->Eta * Err * Out;
      }
    }
  }
}

/******************************************************************************
                      S I M U L A T I N G   T H E   N E T
 ******************************************************************************/

void SimulateNet(NET* Net, REAL* Input, REAL* Output, REAL* Target, BOOL Training)
{
  SetInput(Net, Input);
  PropagateNet(Net);
  GetOutput(Net, Output);

  ComputeOutputError(Net, Target);
  if (Training) {
    BackpropagateNet(Net);
    AdjustWeights(Net);
  }
}

void TrainNet(NET* Net, INT Epochs)
{
  INT  Year, n;
  REAL Output[M];

  for (n=0; n<Epochs*TRAIN_YEARS; n++) {
    Year = RandomEqualINT(TRAIN_LWB, TRAIN_UPB);
    SimulateNet(Net, &(Sunspots[Year-N]), Output, &(Sunspots[Year]), TRUE);
  }
}

void TestNet(NET* Net)
{
  INT  Year;
  REAL Output[M];

  TrainError = 0;
  for (Year=TRAIN_LWB; Year<=TRAIN_UPB; Year++) {
    SimulateNet(Net, &(Sunspots[Year-N]), Output, &(Sunspots[Year]), FALSE);
    TrainError += Net->Error;
  }
  TestError = 0;
  for (Year=TEST_LWB; Year<=TEST_UPB; Year++) {
    SimulateNet(Net, &(Sunspots[Year-N]), Output, &(Sunspots[Year]), FALSE);
    TestError += Net->Error;
  }
  fprintf(f, "\nNMSE is %0.3f on Training Set and %0.3f on Test Set",
             TrainError / TrainErrorPredictingMean,
             TestError  / TestErrorPredictingMean);
}

void EvaluateNet(NET* Net)
{
  INT  Year;
  REAL Output [M];
  REAL Output_[M];

  fprintf(f, "\n\n\n");
  fprintf(f, "Year    Sunspots    Open-Loop Prediction    Closed-Loop Prediction\n");
  fprintf(f, "\n");
  for (Year=EVAL_LWB; Year<=EVAL_UPB; Year++) {
    SimulateNet(Net, &(Sunspots [Year-N]), Output,  &(Sunspots [Year]), FALSE);
    SimulateNet(Net, &(Sunspots_[Year-N]), Output_, &(Sunspots_[Year]), FALSE);
    Sunspots_[Year] = Output_[0];
    fprintf(f, "%d %0.3f %0.3f %0.3f\n",
               FIRST_YEAR + Year,
               Sunspots[Year],
               Output [0],
               Output_[0]);
  }
}

/******************************************************************************
                                   M A I N
 ******************************************************************************/

int main(void)
{
  NET  Net;
  BOOL Stop;
  REAL MinTestError;

  InitializeRandoms();
  GenerateNetwork(&Net);
  RandomWeights(&Net);
  InitializeApplication(&Net);

  Stop = FALSE;
  MinTestError = MAX_REAL;
  do {
    TrainNet(&Net, 10);
    TestNet(&Net);
    if (TestError < MinTestError) {
      fprintf(f, " - saving Weights ...");
      MinTestError = TestError;
      SaveWeights(&Net);
    }
    else if (TestError > 1.2 * MinTestError) {
      fprintf(f, " - stopping Training and restoring Weights ...");
      Stop = TRUE;
      RestoreWeights(&Net);
    }
  } while (NOT Stop);

  TestNet(&Net);
  EvaluateNet(&Net);
  FinalizeApplication(&Net);
  return 0;
}

Simulator Output for the Time-Series Forecasting Application

NMSE is 0.879 on Training Set and 0.834 on Test Set - saving Weights ...
NMSE is 0.818 on Training Set and 0.783 on Test Set - saving Weights ...
NMSE is 0.749 on Training Set and 0.693 on Test Set - saving Weights ...
NMSE is 0.691 on Training Set and 0.614 on Test Set - saving Weights ...
NMSE is 0.622 on Training Set and 0.555 on Test Set - saving Weights ...
NMSE is 0.569 on Training Set and 0.491 on Test Set - saving Weights ...
NMSE is 0.533 on Training Set and 0.467 on Test Set - saving Weights ...
NMSE is 0.490 on Training Set and 0.416 on Test Set - saving Weights ...
NMSE is 0.470 on Training Set and 0.401 on Test Set - saving Weights ...
NMSE is 0.441 on Training Set and 0.361 on Test Set - saving Weights ...
. . .
NMSE is 0.142 on Training Set and 0.143 on Test Set
NMSE is 0.142 on Training Set and 0.146 on Test Set
NMSE is 0.141 on Training Set and 0.143 on Test Set
NMSE is 0.146 on Training Set and 0.141 on Test Set
NMSE is 0.144 on Training Set and 0.141 on Test Set
NMSE is 0.140 on Training Set and 0.142 on Test Set
NMSE is 0.144 on Training Set and 0.148 on Test Set
NMSE is 0.140 on Training Set and 0.139 on Test Set - saving Weights ...
NMSE is 0.140 on Training Set and 0.140 on Test Set
NMSE is 0.141 on Training Set and 0.138 on Test Set - saving Weights ...
. . .
NMSE is 0.104 on Training Set and 0.154 on Test Set
NMSE is 0.102 on Training Set and 0.160 on Test Set
NMSE is 0.102 on Training Set and 0.160 on Test Set
NMSE is 0.100 on Training Set and 0.157 on Test Set
NMSE is 0.105 on Training Set and 0.153 on Test Set
NMSE is 0.100 on Training Set and 0.155 on Test Set
NMSE is 0.101 on Training Set and 0.154 on Test Set
NMSE is 0.100 on Training Set and 0.158 on Test Set
NMSE is 0.107 on Training Set and 0.170 on Test Set - stopping Training and restoring Weights ...
NMSE is 0.141 on Training Set and 0.138 on Test Set

Year    Sunspots    Open-Loop Prediction    Closed-Loop Prediction

1960    0.572       0.532                   0.532
1961    0.327       0.334                   0.301
1962    0.258       0.158                   0.146
1963    0.217       0.156                   0.098
1964    0.143       0.236                   0.149
1965    0.164       0.230                   0.273
1966    0.298       0.263                   0.405
1967    0.495       0.454                   0.552
1968    0.545       0.615                   0.627
1969    0.544       0.550                   0.589
1970    0.540       0.474                   0.464
1971    0.380       0.455                   0.305
1972    0.390       0.270                   0.191
1973    0.260       0.275                   0.139
1974    0.245       0.211                   0.158
1975    0.165       0.181                   0.170
1976    0.153       0.128                   0.175
1977    0.215       0.151                   0.193
1978    0.489       0.316                   0.274
1979    0.754       0.622                   0.373
The results:
Code:
NMSE is 0.879 on Training Set and 0.834 on Test Set - saving Weights ...
NMSE is 0.818 on Training Set and 0.783 on Test Set - saving Weights ...
NMSE is 0.749 on Training Set and 0.693 on Test Set - saving Weights ...
NMSE is 0.691 on Training Set and 0.614 on Test Set - saving Weights ...
NMSE is 0.622 on Training Set and 0.555 on Test Set - saving Weights ...
NMSE is 0.569 on Training Set and 0.491 on Test Set - saving Weights ...
NMSE is 0.533 on Training Set and 0.467 on Test Set - saving Weights ...
NMSE is 0.490 on Training Set and 0.416 on Test Set - saving Weights ...
NMSE is 0.470 on Training Set and 0.401 on Test Set - saving Weights ...
NMSE is 0.441 on Training Set and 0.361 on Test Set - saving Weights ...
NMSE is 0.423 on Training Set and 0.345 on Test Set - saving Weights ...
NMSE is 0.402 on Training Set and 0.329 on Test Set - saving Weights ...
NMSE is 0.385 on Training Set and 0.319 on Test Set - saving Weights ...
NMSE is 0.381 on Training Set and 0.323 on Test Set
NMSE is 0.356 on Training Set and 0.292 on Test Set - saving Weights ...
NMSE is 0.350 on Training Set and 0.279 on Test Set - saving Weights ...
NMSE is 0.333 on Training Set and 0.272 on Test Set - saving Weights ...
NMSE is 0.322 on Training Set and 0.258 on Test Set - saving Weights ...
NMSE is 0.318 on Training Set and 0.250 on Test Set - saving Weights ...
NMSE is 0.303 on Training Set and 0.244 on Test Set - saving Weights ...
NMSE is 0.303 on Training Set and 0.235 on Test Set - saving Weights ...
NMSE is 0.290 on Training Set and 0.238 on Test Set
NMSE is 0.281 on Training Set and 0.224 on Test Set - saving Weights ...
NMSE is 0.275 on Training Set and 0.215 on Test Set - saving Weights ...
NMSE is 0.275 on Training Set and 0.208 on Test Set - saving Weights ...
NMSE is 0.264 on Training Set and 0.209 on Test Set
NMSE is 0.259 on Training Set and 0.207 on Test Set - saving Weights ...
NMSE is 0.255 on Training Set and 0.203 on Test Set - saving Weights ...
NMSE is 0.251 on Training Set and 0.202 on Test Set - saving Weights ...
NMSE is 0.247 on Training Set and 0.198 on Test Set - saving Weights ...
NMSE is 0.246 on Training Set and 0.203 on Test Set
NMSE is 0.243 on Training Set and 0.191 on Test Set - saving Weights ...
NMSE is 0.241 on Training Set and 0.198 on Test Set
NMSE is 0.233 on Training Set and 0.185 on Test Set - saving Weights ...
NMSE is 0.235 on Training Set and 0.184 on Test Set - saving Weights ...
NMSE is 0.232 on Training Set and 0.192 on Test Set
NMSE is 0.224 on Training Set and 0.183 on Test Set - saving Weights ...
NMSE is 0.222 on Training Set and 0.179 on Test Set - saving Weights ...
NMSE is 0.219 on Training Set and 0.179 on Test Set - saving Weights ...
NMSE is 0.225 on Training Set and 0.188 on Test Set
NMSE is 0.213 on Training Set and 0.172 on Test Set - saving Weights ...
NMSE is 0.217 on Training Set and 0.171 on Test Set - saving Weights ...
NMSE is 0.208 on Training Set and 0.169 on Test Set - saving Weights ...
NMSE is 0.207 on Training Set and 0.170 on Test Set
NMSE is 0.204 on Training Set and 0.167 on Test Set - saving Weights ...
NMSE is 0.203 on Training Set and 0.172 on Test Set
NMSE is 0.200 on Training Set and 0.170 on Test Set
NMSE is 0.199 on Training Set and 0.172 on Test Set
NMSE is 0.201 on Training Set and 0.165 on Test Set - saving Weights ...
NMSE is 0.195 on Training Set and 0.164 on Test Set - saving Weights ...
NMSE is 0.200 on Training Set and 0.162 on Test Set - saving Weights ...
NMSE is 0.193 on Training Set and 0.164 on Test Set
NMSE is 0.195 on Training Set and 0.167 on Test Set
NMSE is 0.190 on Training Set and 0.162 on Test Set - saving Weights ...
NMSE is 0.188 on Training Set and 0.155 on Test Set - saving Weights ...
NMSE is 0.189 on Training Set and 0.161 on Test Set
NMSE is 0.189 on Training Set and 0.163 on Test Set
NMSE is 0.184 on Training Set and 0.155 on Test Set - saving Weights ...
NMSE is 0.196 on Training Set and 0.175 on Test Set
NMSE is 0.183 on Training Set and 0.160 on Test Set
NMSE is 0.184 on Training Set and 0.164 on Test Set
NMSE is 0.181 on Training Set and 0.161 on Test Set
NMSE is 0.180 on Training Set and 0.159 on Test Set
NMSE is 0.187 on Training Set and 0.155 on Test Set
NMSE is 0.178 on Training Set and 0.159 on Test Set
NMSE is 0.176 on Training Set and 0.153 on Test Set - saving Weights ...
NMSE is 0.177 on Training Set and 0.160 on Test Set
NMSE is 0.177 on Training Set and 0.153 on Test Set - saving Weights ...
NMSE is 0.179 on Training Set and 0.152 on Test Set - saving Weights ...
NMSE is 0.173 on Training Set and 0.152 on Test Set - saving Weights ...
NMSE is 0.171 on Training Set and 0.155 on Test Set
NMSE is 0.174 on Training Set and 0.152 on Test Set - saving Weights ...
NMSE is 0.170 on Training Set and 0.154 on Test Set
NMSE is 0.170 on Training Set and 0.156 on Test Set
NMSE is 0.170 on Training Set and 0.156 on Test Set
NMSE is 0.169 on Training Set and 0.151 on Test Set - saving Weights ...
NMSE is 0.167 on Training Set and 0.152 on Test Set
NMSE is 0.167 on Training Set and 0.152 on Test Set
NMSE is 0.166 on Training Set and 0.156 on Test Set
NMSE is 0.172 on Training Set and 0.153 on Test Set
NMSE is 0.166 on Training Set and 0.155 on Test Set
NMSE is 0.172 on Training Set and 0.154 on Test Set
NMSE is 0.168 on Training Set and 0.151 on Test Set
NMSE is 0.164 on Training Set and 0.148 on Test Set - saving Weights ...
NMSE is 0.164 on Training Set and 0.152 on Test Set
NMSE is 0.162 on Training Set and 0.148 on Test Set - saving Weights ...
NMSE is 0.165 on Training Set and 0.154 on Test Set
NMSE is 0.161 on Training Set and 0.146 on Test Set - saving Weights ...
NMSE is 0.160 on Training Set and 0.149 on Test Set
NMSE is 0.160 on Training Set and 0.149 on Test Set
NMSE is 0.159 on Training Set and 0.146 on Test Set - saving Weights ...
NMSE is 0.160 on Training Set and 0.148 on Test Set
NMSE is 0.158 on Training Set and 0.145 on Test Set - saving Weights ...
NMSE is 0.157 on Training Set and 0.144 on Test Set - saving Weights ...
NMSE is 0.160 on Training Set and 0.141 on Test Set - saving Weights ...
NMSE is 0.157 on Training Set and 0.144 on Test Set
NMSE is 0.159 on Training Set and 0.150 on Test Set
NMSE is 0.157 on Training Set and 0.144 on Test Set
NMSE is 0.157 on Training Set and 0.150 on Test Set
NMSE is 0.156 on Training Set and 0.150 on Test Set
NMSE is 0.154 on Training Set and 0.144 on Test Set
NMSE is 0.154 on Training Set and 0.146 on Test Set
NMSE is 0.155 on Training Set and 0.149 on Test Set
NMSE is 0.154 on Training Set and 0.148 on Test Set
NMSE is 0.152 on Training Set and 0.144 on Test Set
NMSE is 0.153 on Training Set and 0.145 on Test Set
NMSE is 0.151 on Training Set and 0.143 on Test Set
NMSE is 0.151 on Training Set and 0.145 on Test Set
NMSE is 0.152 on Training Set and 0.143 on Test Set
NMSE is 0.152 on Training Set and 0.147 on Test Set
NMSE is 0.151 on Training Set and 0.141 on Test Set - saving Weights ...
NMSE is 0.154 on Training Set and 0.141 on Test Set
NMSE is 0.152 on Training Set and 0.147 on Test Set
NMSE is 0.150 on Training Set and 0.146 on Test Set
NMSE is 0.149 on Training Set and 0.146 on Test Set
NMSE is 0.150 on Training Set and 0.150 on Test Set
NMSE is 0.148 on Training Set and 0.143 on Test Set
NMSE is 0.150 on Training Set and 0.147 on Test Set
NMSE is 0.147 on Training Set and 0.144 on Test Set
NMSE is 0.153 on Training Set and 0.142 on Test Set
NMSE is 0.147 on Training Set and 0.144 on Test Set
NMSE is 0.146 on Training Set and 0.144 on Test Set
NMSE is 0.147 on Training Set and 0.141 on Test Set
NMSE is 0.145 on Training Set and 0.141 on Test Set - saving Weights ...
NMSE is 0.145 on Training Set and 0.141 on Test Set - saving Weights ...
NMSE is 0.145 on Training Set and 0.140 on Test Set - saving Weights ...
NMSE is 0.145 on Training Set and 0.142 on Test Set
NMSE is 0.147 on Training Set and 0.140 on Test Set - saving Weights ...
NMSE is 0.150 on Training Set and 0.140 on Test Set
NMSE is 0.150 on Training Set and 0.141 on Test Set
NMSE is 0.143 on Training Set and 0.144 on Test Set
NMSE is 0.143 on Training Set and 0.142 on Test Set
NMSE is 0.142 on Training Set and 0.141 on Test Set
NMSE is 0.142 on Training Set and 0.143 on Test Set
NMSE is 0.142 on Training Set and 0.143 on Test Set
NMSE is 0.142 on Training Set and 0.146 on Test Set
NMSE is 0.141 on Training Set and 0.143 on Test Set
NMSE is 0.146 on Training Set and 0.141 on Test Set
NMSE is 0.144 on Training Set and 0.141 on Test Set
NMSE is 0.140 on Training Set and 0.142 on Test Set
NMSE is 0.144 on Training Set and 0.148 on Test Set
NMSE is 0.140 on Training Set and 0.139 on Test Set - saving Weights ...
NMSE is 0.140 on Training Set and 0.140 on Test Set
NMSE is 0.141 on Training Set and 0.138 on Test Set - saving Weights ...
NMSE is 0.139 on Training Set and 0.140 on Test Set
NMSE is 0.138 on Training Set and 0.141 on Test Set
NMSE is 0.138 on Training Set and 0.140 on Test Set
NMSE is 0.138 on Training Set and 0.141 on Test Set
NMSE is 0.137 on Training Set and 0.141 on Test Set
NMSE is 0.139 on Training Set and 0.140 on Test Set
NMSE is 0.140 on Training Set and 0.140 on Test Set
NMSE is 0.137 on Training Set and 0.142 on Test Set
NMSE is 0.137 on Training Set and 0.144 on Test Set
NMSE is 0.137 on Training Set and 0.141 on Test Set
NMSE is 0.144 on Training Set and 0.142 on Test Set
NMSE is 0.140 on Training Set and 0.151 on Test Set
NMSE is 0.135 on Training Set and 0.142 on Test Set
NMSE is 0.138 on Training Set and 0.142 on Test Set
NMSE is 0.135 on Training Set and 0.144 on Test Set
NMSE is 0.134 on Training Set and 0.144 on Test Set
NMSE is 0.134 on Training Set and 0.141 on Test Set
NMSE is 0.134 on Training Set and 0.145 on Test Set
NMSE is 0.134 on Training Set and 0.142 on Test Set
NMSE is 0.134 on Training Set and 0.144 on Test Set
NMSE is 0.133 on Training Set and 0.142 on Test Set
NMSE is 0.133 on Training Set and 0.143 on Test Set
NMSE is 0.132 on Training Set and 0.145 on Test Set
NMSE is 0.133 on Training Set and 0.146 on Test Set
NMSE is 0.132 on Training Set and 0.146 on Test Set
NMSE is 0.132 on Training Set and 0.144 on Test Set
NMSE is 0.132 on Training Set and 0.147 on Test Set
NMSE is 0.132 on Training Set and 0.149 on Test Set
NMSE is 0.132 on Training Set and 0.148 on Test Set
NMSE is 0.136 on Training Set and 0.154 on Test Set
NMSE is 0.131 on Training Set and 0.143 on Test Set
NMSE is 0.132 on Training Set and 0.148 on Test Set
NMSE is 0.137 on Training Set and 0.141 on Test Set
NMSE is 0.129 on Training Set and 0.143 on Test Set
NMSE is 0.130 on Training Set and 0.142 on Test Set
NMSE is 0.129 on Training Set and 0.147 on Test Set
NMSE is 0.129 on Training Set and 0.146 on Test Set
NMSE is 0.129 on Training Set and 0.148 on Test Set
NMSE is 0.128 on Training Set and 0.145 on Test Set
NMSE is 0.131 on Training Set and 0.149 on Test Set
NMSE is 0.128 on Training Set and 0.142 on Test Set
NMSE is 0.134 on Training Set and 0.142 on Test Set
NMSE is 0.128 on Training Set and 0.146 on Test Set
NMSE is 0.127 on Training Set and 0.145 on Test Set
NMSE is 0.127 on Training Set and 0.144 on Test Set
NMSE is 0.127 on Training Set and 0.146 on Test Set
NMSE is 0.126 on Training Set and 0.145 on Test Set
NMSE is 0.126 on Training Set and 0.148 on Test Set
NMSE is 0.131 on Training Set and 0.156 on Test Set
NMSE is 0.126 on Training Set and 0.148 on Test Set
NMSE is 0.125 on Training Set and 0.147 on Test Set
NMSE is 0.138 on Training Set and 0.147 on Test Set
NMSE is 0.128 on Training Set and 0.154 on Test Set
NMSE is 0.124 on Training Set and 0.146 on Test Set
NMSE is 0.126 on Training Set and 0.144 on Test Set
NMSE is 0.125 on Training Set and 0.150 on Test Set
NMSE is 0.130 on Training Set and 0.146 on Test Set
NMSE is 0.125 on Training Set and 0.145 on Test Set
NMSE is 0.123 on Training Set and 0.146 on Test Set
NMSE is 0.123 on Training Set and 0.147 on Test Set
NMSE is 0.124 on Training Set and 0.150 on Test Set
NMSE is 0.128 on Training Set and 0.146 on Test Set
NMSE is 0.123 on Training Set and 0.146 on Test Set
NMSE is 0.124 on Training Set and 0.151 on Test Set
NMSE is 0.122 on Training Set and 0.146 on Test Set
NMSE is 0.131 on Training Set and 0.146 on Test Set
NMSE is 0.121 on Training Set and 0.146 on Test Set
NMSE is 0.129 on Training Set and 0.159 on Test Set
NMSE is 0.121 on Training Set and 0.146 on Test Set
NMSE is 0.121 on Training Set and 0.144 on Test Set
NMSE is 0.122 on Training Set and 0.151 on Test Set
NMSE is 0.120 on Training Set and 0.147 on Test Set
NMSE is 0.120 on Training Set and 0.147 on Test Set
NMSE is 0.120 on Training Set and 0.147 on Test Set
NMSE is 0.119 on Training Set and 0.147 on Test Set
NMSE is 0.120 on Training Set and 0.152 on Test Set
NMSE is 0.119 on Training Set and 0.148 on Test Set
NMSE is 0.120 on Training Set and 0.148 on Test Set
NMSE is 0.121 on Training Set and 0.156 on Test Set
NMSE is 0.120 on Training Set and 0.148 on Test Set
NMSE is 0.119 on Training Set and 0.146 on Test Set
NMSE is 0.119 on Training Set and 0.152 on Test Set
NMSE is 0.119 on Training Set and 0.152 on Test Set
NMSE is 0.118 on Training Set and 0.153 on Test Set
NMSE is 0.122 on Training Set and 0.146 on Test Set
NMSE is 0.120 on Training Set and 0.146 on Test Set
NMSE is 0.117 on Training Set and 0.152 on Test Set
NMSE is 0.118 on Training Set and 0.148 on Test Set
NMSE is 0.118 on Training Set and 0.155 on Test Set
NMSE is 0.117 on Training Set and 0.148 on Test Set
NMSE is 0.117 on Training Set and 0.148 on Test Set
NMSE is 0.117 on Training Set and 0.148 on Test Set
NMSE is 0.118 on Training Set and 0.156 on Test Set
NMSE is 0.118 on Training Set and 0.157 on Test Set
NMSE is 0.115 on Training Set and 0.153 on Test Set
NMSE is 0.115 on Training Set and 0.150 on Test Set
NMSE is 0.118 on Training Set and 0.157 on Test Set
NMSE is 0.117 on Training Set and 0.157 on Test Set
NMSE is 0.114 on Training Set and 0.149 on Test Set
NMSE is 0.115 on Training Set and 0.148 on Test Set
NMSE is 0.115 on Training Set and 0.149 on Test Set
NMSE is 0.117 on Training Set and 0.151 on Test Set
NMSE is 0.115 on Training Set and 0.151 on Test Set
NMSE is 0.113 on Training Set and 0.150 on Test Set
NMSE is 0.114 on Training Set and 0.155 on Test Set
NMSE is 0.113 on Training Set and 0.149 on Test Set
NMSE is 0.113 on Training Set and 0.151 on Test Set
NMSE is 0.115 on Training Set and 0.148 on Test Set
NMSE is 0.114 on Training Set and 0.148 on Test Set
NMSE is 0.112 on Training Set and 0.152 on Test Set
NMSE is 0.112 on Training Set and 0.152 on Test Set
NMSE is 0.113 on Training Set and 0.154 on Test Set
NMSE is 0.112 on Training Set and 0.151 on Test Set
NMSE is 0.111 on Training Set and 0.150 on Test Set
NMSE is 0.112 on Training Set and 0.153 on Test Set
NMSE is 0.113 on Training Set and 0.148 on Test Set
NMSE is 0.112 on Training Set and 0.149 on Test Set
NMSE is 0.111 on Training Set and 0.149 on Test Set
NMSE is 0.111 on Training Set and 0.149 on Test Set
NMSE is 0.113 on Training Set and 0.155 on Test Set
NMSE is 0.110 on Training Set and 0.151 on Test Set
NMSE is 0.112 on Training Set and 0.158 on Test Set
NMSE is 0.112 on Training Set and 0.149 on Test Set
NMSE is 0.115 on Training Set and 0.163 on Test Set
NMSE is 0.109 on Training Set and 0.152 on Test Set
NMSE is 0.112 on Training Set and 0.150 on Test Set
NMSE is 0.109 on Training Set and 0.154 on Test Set
NMSE is 0.113 on Training Set and 0.160 on Test Set
NMSE is 0.109 on Training Set and 0.157 on Test Set
NMSE is 0.109 on Training Set and 0.155 on Test Set
NMSE is 0.108 on Training Set and 0.154 on Test Set
NMSE is 0.112 on Training Set and 0.161 on Test Set
NMSE is 0.107 on Training Set and 0.152 on Test Set
NMSE is 0.108 on Training Set and 0.154 on Test Set
NMSE is 0.107 on Training Set and 0.154 on Test Set
NMSE is 0.110 on Training Set and 0.157 on Test Set
NMSE is 0.109 on Training Set and 0.158 on Test Set
NMSE is 0.107 on Training Set and 0.152 on Test Set
NMSE is 0.107 on Training Set and 0.155 on Test Set
NMSE is 0.106 on Training Set and 0.155 on Test Set
NMSE is 0.106 on Training Set and 0.154 on Test Set
NMSE is 0.106 on Training Set and 0.151 on Test Set
NMSE is 0.108 on Training Set and 0.157 on Test Set
NMSE is 0.106 on Training Set and 0.154 on Test Set
NMSE is 0.111 on Training Set and 0.150 on Test Set
NMSE is 0.110 on Training Set and 0.161 on Test Set
NMSE is 0.105 on Training Set and 0.153 on Test Set
NMSE is 0.106 on Training Set and 0.151 on Test Set
NMSE is 0.106 on Training Set and 0.151 on Test Set
NMSE is 0.105 on Training Set and 0.155 on Test Set
NMSE is 0.105 on Training Set and 0.155 on Test Set
NMSE is 0.105 on Training Set and 0.154 on Test Set
NMSE is 0.105 on Training Set and 0.152 on Test Set
NMSE is 0.105 on Training Set and 0.155 on Test Set
NMSE is 0.105 on Training Set and 0.153 on Test Set
NMSE is 0.107 on Training Set and 0.152 on Test Set
NMSE is 0.108 on Training Set and 0.152 on Test Set
NMSE is 0.104 on Training Set and 0.156 on Test Set
NMSE is 0.104 on Training Set and 0.153 on Test Set
NMSE is 0.104 on Training Set and 0.157 on Test Set
NMSE is 0.106 on Training Set and 0.155 on Test Set
NMSE is 0.103 on Training Set and 0.158 on Test Set
NMSE is 0.104 on Training Set and 0.154 on Test Set
NMSE is 0.102 on Training Set and 0.157 on Test Set
NMSE is 0.109 on Training Set and 0.155 on Test Set
NMSE is 0.102 on Training Set and 0.156 on Test Set
NMSE is 0.103 on Training Set and 0.152 on Test Set
NMSE is 0.102 on Training Set and 0.156 on Test Set
NMSE is 0.104 on Training Set and 0.152 on Test Set
NMSE is 0.102 on Training Set and 0.157 on Test Set
NMSE is 0.103 on Training Set and 0.162 on Test Set
NMSE is 0.101 on Training Set and 0.158 on Test Set
NMSE is 0.102 on Training Set and 0.160
on Test Set NMSE is 0.102 on Training Set and 0.161 on Test Set NMSE is 0.104 on Training Set and 0.154 on Test Set NMSE is 0.102 on Training Set and 0.160 on Test Set NMSE is 0.102 on Training Set and 0.160 on Test Set NMSE is 0.100 on Training Set and 0.157 on Test Set NMSE is 0.105 on Training Set and 0.153 on Test Set NMSE is 0.100 on Training Set and 0.155 on Test Set NMSE is 0.101 on Training Set and 0.154 on Test Set NMSE is 0.100 on Training Set and 0.158 on Test Set NMSE is 0.107 on Training Set and 0.170 on Test Set - stopping Training and restoring Weights ... NMSE is 0.141 on Training Set and 0.138 on Test Set Year Sunspots Open-Loop Prediction Closed-LoopPrediction 1960 0.572 0.5320.532 1961 0.327 0.3340.301 1962 0.258 0.1580.146 1963 0.217 0.1560.098 1964 0.143 0.2360.149 1965 0.164 0.2300.273 1966 0.298 0.2630.405 1967 0.495 0.4540.552 1968 0.545 0.6150.627 1969 0.544 0.5500.589 1970 0.540 0.4740.464 1971 0.380 0.4550.305 1972 0.390 0.2700.191 1973 0.260 0.2750.139 1974 0.245 0.2110.158 1975 0.165 0.1810.170 1976 0.153 0.1280.175 1977 0.215 0.1510.193 1978 0.489 0.3160.274 1979 0.754 0.6220.373
- 05-01-2012, 02:51 AM #17
- 05-01-2012, 06:50 AM #18
Honestly, I didn't understand the last post. Could you clarify?!
- 05-01-2012, 10:10 AM #19
May God give you strength for this wonderful effort,
and may He count it among your good deeds.
Is this a martingale (doubling) expert?
I ran a test on the last four months, but my results didn't come out right.
Where is the take-profit, how do I change it, and could you please give us the meanings of the terms on the left? Last edited by ساري الليل, 05-01-2012 at 10:21 AM
- 05-01-2012, 01:14 PM #20
In my view two months is not enough.
At the very least, two years of backtesting.
And I don't think you need to reach 5,000 trades in those two years.
You said something very important: test any strategy and you'll find it failing, or at least unprofitable, over the long run.
Try the following:
Don't use numeric targets; base entries and exits on indicators.
That means no stop loss either.
Don't use martingale or hedging.
However bad the first tests are, try filtering with another indicator, and keep at it. It's a tiring method, but its results are amazing.
- 05-01-2012, 02:44 PM #21
As soon as you type its first two lines into Google you'll find it: an English site full of neural-network ideas.
I'm talking about a foreign site with neural networks for predicting a future sequence of numbers, and I'm saying someone could think about how to exploit the idea; with luck we could get a successful expert out of it.
That is, once someone understands the idea, he could try predicting the moving average, or where a given indicator will head, or something else, to build a system that predicts price.
Someone will say: I consider predicting the future empty talk and I don't trust it. I say: there is no forex trader whose goal is not predicting future prices, whether he analyses manually, uses an expert, or follows a strategy. They all share one goal: predicting future prices.
As for me, the part of the backtest that matters is the second part shown in the picture, starting from 2/5, after which I will decide whether or not to use the expert on a live account.
No, it's an expert with a 30-pip target and a 30-pip stop, not a martingale expert. But it has money management that increases the lots as the balance grows and decreases them as it shrinks. Your backtest will certainly differ from mine because of spread differences between the broker I backtested on and yours. There is no adjustable target or stop, because it is programmed to give an excellent backtest when the target and stop are both 30; change that and all the calculations change.
Thanks for the information. Last edited by Ram22, 05-01-2012 at 03:06 PM
- 05-01-2012, 07:54 PM #22
First you have the magic number, which distinguishes this expert's trades from others; it isn't very important in this expert, so leave it or change it, no problem.
The second option: if you set it to true you'll use money management, and below it you write the risk percentage, 1% or even 30%; just don't exceed 100%.
The next option is the maximum lot: once a trade reaches this size, the expert cannot increase the lots any further as the balance grows.
The option after that is the lot size in case you don't use money management and set it to false.
The next option: you change the number from 0 to 1 in steps of 0.1. The closer it gets to 1, the stricter the expert is about the success of buy trades and the fewer trades it takes; the closer to 0, the more trades it takes, including more losing ones.
The next option is the same, but for sells instead of buys.
For example, set them to 0.6, then to 0.9, with 30% risk and a maximum lot of 100:
In the first case it makes many trades and turns 10,000 into 2,166,353; in the second it makes fewer, more successful trades and turns 10,000 into 286,808, from 2/8/2011 to today.
This is the advantage of using an expert whose target and stop loss are equal, or whose stop is smaller than the target:
you can add money management, and if the trades succeed, grow the balance to amounts that large,
whereas with martingale or scalping that is impossible, especially over such a period.
That's why, if you ever find an expert whose trades keep succeeding in the future with a target greater than or equal to the stop, you've found a treasure.
Last edited by Ram22, 05-01-2012 at 08:20 PM
- 06-01-2012, 09:59 AM #23
There is a program I found that demonstrates the idea of predicting a sequence of numbers with a neural network: after it learns the path the numbers take, it predicts the ones to come. I say that if someone understood the idea well, he could build a system that predicts price, or a low-risk expert with reasonable profit, by predicting specific things such as the high of the coming five candles, the movement of the ADX, or the movement of a moving average. What matters is that the winning predictions outnumber the losing ones.
I've noticed that people interested in experts come in two types:
The first type cares a lot about the backtest and treats it as the foundation.
The second type says: I don't rely on the backtest; I build an expert without even glancing at it.
The first can fall for an expert programmed by someone who doesn't care about wasting people's time, who tells them "I made millions with this expert", shows them the backtest, then widens the stop and turns it into a scalper or a martingale after confirming the drawdown looks small.
A month or two later the margin call arrives. The people who play these games the most are the Russians; if they really had an expert that made such huge sums, I doubt they would hand it over on a silver platter.
That way they've cheated twice: once on the backtest and once on the demo or live trial period.
The second type wastes his time on demo and live trials when he could run equivalent tests with a single click, if he followed the right rules. Last edited by Ram22, 06-01-2012 at 10:25 AM
- 07-01-2012, 12:58 AM #24
I want to build another expert, better and more successful than this one. Everyone with some programming knowledge should run experiments on predicting the future, take ideas from the program and the code I posted here, and tell us the results, so that we discover a more successful expert and build it. I'm sure many people here understand programming but rarely post.
I've also started experimenting on this topic.
For example, I think (God knows best) that this expert will not succeed in the future, and I'm trying to find a better one.
Picture 1 shows what I expect to happen to the expert, God knows best.
I ran other experiments to build an expert on the 15-minute chart, and picture 2 shows what I found.
I could post a ready-made expert here with a better backtest over a longer period than the one I posted, but what matters to me is prediction of the future, not success on the backtest and the past.
- 07-01-2012, 01:25 AM #25
Dear brother Ram, clearly you're talking about some very, very big things.
And I have to tell you I don't understand a single thing you've posted, hahaha.
Take it one step at a time with me, God bless you... these predictions you've posted: did you make them manually, or what computes them?
- 07-01-2012, 03:00 AM #26
Predicting the past: I mean the period you ran the backtest on and tuned the settings for until the result was as good as possible.
Predicting the future: I mean the period after the optimized backtest, where you change nothing, and where you test the expert's ability to do well on its own, without you optimizing it.
The point of the idea is to test the expert's ability to run live and succeed, because live you cannot optimize it: optimization is built on knowing the future prices.
So suppose you optimized the expert from 2/8/2011 to 12/10/2011, then ran it over the period from 12/10/2011 to today without making any new adjustments; the pictures illustrate what the idea means.
This idea is not new: Better, who took first place in the 2007 expert-advisor competition, used it.
- 07-01-2012, 12:17 PM #27
So far so good... no problem with the past, since we all know what a backtest means.
But as for the future: to test it we run a forward test for a reasonable period, to see how the expert performs against the real market.
The program you posted, the one called Approximation: what is its role in the forward test?
- 07-01-2012, 05:02 PM #28
My sincere greetings from the heart.
Dear brother, I have an indicator that draws a forecast, but to be honest it isn't accurate: sometimes it differs completely from reality, and the values of its future candles change. Would it be of any use to you?
If it might help for study or review, it's at your disposal and I can upload it. But I warn the brothers against treating it as a strong indicator and advise against relying on it. Still, if you could benefit from its settings, its idea, or its calculation, I can upload it, God willing.
- 07-01-2012, 10:01 PM #29
Testing on demo and live is important, but many people do it far too early.
The backtest is close to live trading; problems only arise from flaws in the expert, broker delays, and the like, and none of those are big problems, and they can be solved.
The big problem is obtaining an expert that predicts the future and succeeds consistently.
So, instead of testing the expert on demo for a whole month, you can wait a month after the optimization and then run a forward test on the backtest. If the forward tests on the backtest succeed, you run short demo and live trials and fix any live-only problem until the expert matches the backtest.
The program Approximation is a neural-network program I found that demonstrates training on a sequence of numbers.
May God reward you.
But I want to try new methods I haven't tried before; perhaps one of them holds a way to profit that we don't yet know.
I posted the code of a program that predicts a sequence of numbers with a neural network, and a program I found that demonstrates the idea; it needs someone to understand it and experiment with it in his free time, as it may hold secrets we don't know.
I use a neural network and am trying to build a successful expert with it. The expert posted here runs on a neural network, but as you saw, the backtest succeeds while the future does not, and perhaps other network types predict better, God knows best.
- 23-05-2012, 02:14 AM #30
Re: Will this work on a live account?
This is a backtest from 30-1-2012 to 22-5-2012, on the same broker and with the same expert from the first post. Just as I told you at the start, the backtest would not hold up.
Instead of testing on demo for a few months, we saved time and ran it as a forward test on the backtest.