performance - Timing tests on Azure ML -


I have created datasets of various sizes (1 GB, 2 GB, 3 GB, 4 GB; all under 10 GB) and am executing various machine learning models on Azure ML.

1) Can I know the server specifications (RAM, CPU) provided in the Azure ML service?

2) At times the Reader says "memory exhausted" with more than 4 GB of data, though Azure ML should be able to handle 10 GB of data per the documentation.

3) If I run multiple experiments in parallel (in different browser tabs), they take more time.

4) Is there a way to set the RAM or CPU cores in Azure ML?

I have a partial answer:

  1. No, it's abstracted away.

  2. The following types of data can expand into larger datasets during feature normalization, and are therefore limited to less than 10 GB:

    sparse, categorical, strings, binary data

(see this)
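To see why those data types hit the limit sooner, here is a rough pure-Python sketch (illustrative only, not Azure ML code) of how one-hot encoding a categorical column multiplies its footprint: one value per row becomes one 0/1 cell per distinct category. The row count and category count below are made-up numbers for illustration.

```python
# Hypothetical illustration: one-hot encoding a categorical column
# replaces each single value with one column per distinct category,
# so the in-memory size can grow many times over during featurization.
rows = 1000
categories = [f"cat_{i}" for i in range(50)]  # assume 50 distinct values
column = [categories[i % len(categories)] for i in range(rows)]

# Dense one-hot expansion: each row becomes a 50-element 0/1 vector.
encoded = [[1 if value == c else 0 for c in categories] for value in column]

original_cells = rows                      # 1 cell per row before encoding
expanded_cells = rows * len(categories)    # 50 cells per row after encoding
print(expanded_cells // original_cells)    # expansion factor
```

With 50 categories the table grows 50-fold, which is how a 1 GB input can approach the 10 GB ceiling during normalization.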

  3. I'm not sure; while working on it, I didn't notice a difference between running a single experiment and running multiple experiments.

  4. You can scale up the machines in the Standard tier (see this).

