performance - Timing tests on Azure ML
I have created datasets of various sizes (1 GB, 2 GB, 3 GB, 4 GB, all under 10 GB) and am executing various machine learning models on them in Azure ML; a sketch for generating such files follows the questions.
1) Can I know the server specifications (RAM, CPU) provided by the Azure ML service?
2) At times the Reader module says "memory exhausted" for more than 4 GB of data, even though Azure ML should be able to handle 10 GB of data according to the documentation.
3) If I run multiple experiments in parallel (in different browser tabs), they take more time.
4) Is there a way to set the RAM and CPU cores in Azure ML?
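For reference, test files like these can be generated locally along the following lines. This is a rough sketch, not code from the experiment: the `make_dataset` helper, the column count, and the ~20-bytes-per-cell CSV estimate are my own illustrative choices.

```python
import numpy as np
import pandas as pd

# Hypothetical helper: write a numeric CSV of roughly `target_gb` gigabytes
# in chunks, so memory use stays flat regardless of the target size.
def make_dataset(path, target_gb, cols=20, chunk_rows=100_000):
    bytes_per_row = cols * 20  # rough: ~20 CSV bytes per float cell
    rows_needed = int(target_gb * 2**30 / bytes_per_row)
    first = True
    for start in range(0, rows_needed, chunk_rows):
        n = min(chunk_rows, rows_needed - start)
        chunk = pd.DataFrame(np.random.rand(n, cols))
        chunk.to_csv(path, mode="w" if first else "a", header=first, index=False)
        first = False

# make_dataset("timing_test_4gb.csv", 4)  # writes a ~4 GB file
```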
I have a partial answer:

1. No, it's abstracted.
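Even though the hardware is abstracted, you can get a rough idea by probing it from inside the experiment. A minimal sketch, assuming an Execute Python Script module with the classic `azureml_main` entry point, and noting that `psutil` may or may not be preinstalled there:

```python
import multiprocessing

# Sketch for an "Execute Python Script" module: report the hardware
# the experiment actually runs on. azureml_main is the entry point
# the module calls with up to two input dataframes.
def azureml_main(dataframe1=None, dataframe2=None):
    print("logical CPU cores:", multiprocessing.cpu_count())
    try:
        import psutil  # assumption: may not be preinstalled in the module
        print("total RAM (GiB):", psutil.virtual_memory().total / 2.0**30)
    except ImportError:
        print("psutil unavailable; RAM not reported")
    return dataframe1,  # pass the input through unchanged
```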
2. The following types of data can expand into larger datasets during feature normalization and are therefore limited to less than 10 GB: sparse, categorical, strings, and binary data.
(see this)
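That expansion is easy to reproduce locally: one-hot encoding a single string column creates one indicator column per distinct value, so a modest file can balloon well past its on-disk size. A small illustration with made-up numbers (not Azure ML code):

```python
import pandas as pd

# Illustrative only: 100k rows with 1,000 distinct string values expand
# into 1,000 indicator columns when one-hot encoded.
n_rows, n_categories = 100_000, 1_000
df = pd.DataFrame({"cat": ["c%d" % (i % n_categories) for i in range(n_rows)]})

before = df.memory_usage(deep=True).sum()
after = pd.get_dummies(df["cat"]).memory_usage(deep=True).sum()
print("before: %.1f MiB, after: %.1f MiB" % (before / 2**20, after / 2**20))
```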
3. I'm not sure; while working on it, I didn't notice a difference between running a single experiment and running multiple experiments.
4. You can scale up the machines in the Standard tier (see this).