Saturday, August 9, 2014

Installing Hadoop 1.2.1(Single node) in Ubuntu 14.04 with openjdk-7

Procedure for installing Hadoop (Single Node) on Ubuntu 14.04 with openjdk-7

In this tutorial I explain the steps to install and configure Hadoop 1.2.1 on Ubuntu 14.04 with openjdk-7.

Step 1: Prerequisites

1. Download Hadoop 1.2.1

Hadoop 1.2.1 can be downloaded from here. Please download a stable version of Hadoop.

2. Install openjdk-7

Hadoop needs a working Java installation. Open a new terminal and run the following commands.
  • bimal@bimal:~$ sudo apt-get update
  • bimal@bimal:~$ sudo apt-get upgrade
  • bimal@bimal:~$ sudo apt-get install openjdk-7-jdk
After installing Java we need to add JAVA_HOME to the Ubuntu environment. To do so, edit /etc/environment.
  • bimal@bimal:~$ sudo gedit /etc/environment
Then you need to append the following line to the file.
  • JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
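The path above assumes a 64-bit (amd64) install. As a small addition to the original steps, you can derive the correct value from the javac on your PATH instead of hard-coding it, so the same approach also works on i386 machines:

```shell
# Sketch (not from the original post): find the JDK directory by
# resolving the javac symlink and stripping the trailing /bin/javac.
javac_bin="$(command -v javac || echo /usr/bin/javac)"
jdk_home="$(readlink -f "$javac_bin" | sed 's:/bin/javac$::')"
echo "JAVA_HOME=${jdk_home}"
```

Use the printed path as the JAVA_HOME value in /etc/environment.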
3. Add a dedicated user

We add a dedicated user (hduser) and a group (hadoop), so that the Hadoop installation stays separate from the other accounts on the machine.
  • bimal@bimal:~$ sudo addgroup hadoop
  • bimal@bimal:~$ sudo adduser --ingroup hadoop hduser


4. Installing and configuring an SSH server

Hadoop requires SSH access to manage its nodes: remote machines, plus your local machine if you want to run Hadoop on it. So even for this single-node setup we need to configure an SSH server.
First install openssh-server using the following command.
  • bimal@bimal:~$ sudo apt-get install openssh-server


After installation you need to open a new terminal and switch the user to hduser by using the following commands.
  • bimal@bimal:~$ su - hduser

After switching to hduser you need to create the SSH key using the following commands.

  • hduser@bimal:~$ ssh-keygen -t rsa -P ""

The above command creates an RSA key pair with an empty passphrase, so Hadoop is not prompted for a passphrase every time it connects to a node.

Then we enable SSH access to the local machine with the key we just created. The following command appends the public key to authorized_keys.
  • hduser@bimal:~$ cat $HOME/.ssh/id_rsa.pub >> $HOME/.ssh/authorized_keys
The final step is to connect to our local machine as hduser. This also saves the local machine’s host key fingerprint to hduser’s known_hosts file.
  • hduser@bimal:~$ ssh localhost
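If `ssh localhost` still prompts for a password even though the key was copied, a common cause (an addition to the original steps) is loose file permissions: sshd silently ignores authorized_keys when the file or ~/.ssh is group- or world-writable. A sketch of the fix, run as hduser:

```shell
# Tighten permissions so sshd will accept the key file.
mkdir -p "$HOME/.ssh"
touch "$HOME/.ssh/authorized_keys"   # no-op if it already exists
chmod 700 "$HOME/.ssh"
chmod 600 "$HOME/.ssh/authorized_keys"
```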

If you get an error during this step, try reinstalling openssh-server.

5. Disable IPv6

Before disabling IPv6, type exit to return to your own user (or open a new terminal), then edit the file /etc/sysctl.conf using the following command.
  • hduser@bimal:~$ sudo gedit /etc/sysctl.conf
Add the following lines at the end of the file.
  • net.ipv6.conf.all.disable_ipv6 = 1
  • net.ipv6.conf.default.disable_ipv6 = 1
  • net.ipv6.conf.lo.disable_ipv6 = 1
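A reboot (as described below) always works; as an aside not in the original post, sysctl can usually apply the settings immediately. A sketch:

```shell
# Reload /etc/sysctl.conf without rebooting (standard sysctl behaviour):
#   sudo sysctl -p
# Then read back the flag; 1 means IPv6 is disabled.
f=/proc/sys/net/ipv6/conf/all/disable_ipv6
if [ -r "$f" ]; then
  echo "disable_ipv6 = $(cat "$f")"
else
  echo "no IPv6 proc entry on this machine"
fi
```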

Restart your system for the change to take effect.

You can check whether IPv6 is disabled with the following command.
  • hduser@bimal:~$ cat /proc/sys/net/ipv6/conf/all/disable_ipv6

If it returns 0, IPv6 is still enabled; if it returns 1, IPv6 is disabled.

Step 2: Install Hadoop

1. Extract and Modify Permissions

First move the Hadoop package to /usr/local and change into that directory. Extract the package with tar, rename the extracted directory to hadoop, and finally change the owner of the hadoop directory and of all files and directories inside it. The following commands do this.
  • sudo mv /home/bimal/Downloads/hadoop-1.2.1.tar.gz /usr/local
  • cd /usr/local
  • sudo tar xzf hadoop-1.2.1.tar.gz
  • sudo mv hadoop-1.2.1 hadoop
  • sudo chown -R hduser:hadoop hadoop
Note: if the -R option gives an error (the dash is sometimes pasted as an en-dash), retype the command or use the long form --recursive instead.

2. Update ‘$HOME/.bashrc’ of hduser

First open $HOME/.bashrc of hduser.
  • hduser@bimal:~$ sudo gedit /home/hduser/.bashrc
After opening the file, append the following lines to the end of it.

# Set Hadoop-related environment variables
export HADOOP_PREFIX=/usr/local/hadoop

# Set JAVA_HOME (we will also configure JAVA_HOME directly for Hadoop later on)
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64

# Some convenient aliases and functions for running Hadoop-related commands
unalias fs &> /dev/null
alias fs="hadoop fs"
unalias hls &> /dev/null
alias hls="fs -ls"

# If you have LZO compression enabled in your Hadoop cluster and
# compress job outputs with LZOP (not covered in this tutorial):
# conveniently inspect an LZOP compressed file from the command
# line; run via:
#
# $ lzohead /hdfs/path/to/lzop/compressed/file.lzo
#
# Requires installed 'lzop' command.
#
lzohead () {
    hadoop fs -cat $1 | lzop -dc | head -1000 | less
}

# Add Hadoop bin/ directory to PATH
export PATH=$PATH:$HADOOP_PREFIX/bin
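The PATH line above only takes effect in new shells. A quick sanity check (a sketch using the same /usr/local/hadoop/bin path as above; run it in a new terminal, or after `source ~/.bashrc`):

```shell
# Report whether the Hadoop bin directory is on the current PATH.
hadoop_bin=/usr/local/hadoop/bin
case ":$PATH:" in
  *:"$hadoop_bin":*) echo "$hadoop_bin is on PATH" ;;
  *) echo "$hadoop_bin is NOT on PATH yet" ;;
esac
```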


Step 3: Configuring Hadoop

Now we have to configure the directory where Hadoop will store its data files, the network ports it listens to, etc.

1. Assigning the working directory

We will use the directory ‘/app/hadoop/tmp’ . Hadoop’s default configurations use hadoop.tmp.dir as the base temporary directory both for the local file system and HDFS. Now we create the directory and set the required ownerships and permissions.

  • hduser@bimal:~$ sudo mkdir -p /app/hadoop/tmp
  • hduser@bimal:~$ sudo chown hduser:hadoop /app/hadoop/tmp
  • hduser@bimal:~$ sudo chmod 750 /app/hadoop/tmp
If you forget to set the required ownerships and permissions, you will see a java.io.IOException when you try to format the name node in the next section.

2. Configuring Hadoop setup files

I. hadoop-env.sh

The only required environment variable we have to configure for Hadoop itself is JAVA_HOME. Open /usr/local/hadoop/conf/hadoop-env.sh and replace
# The java implementation to use. Required.
#export JAVA_HOME=/usr/lib/jvm/
With
# The java implementation to use. Required.
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64


II. core-site.xml


Open the file core-site.xml and add the following lines between <configuration>...</configuration>.

  • hduser@bimal:~$ sudo gedit /usr/local/hadoop/conf/core-site.xml
<property>
<name>hadoop.tmp.dir</name>
<value>/app/hadoop/tmp</value>
<description>A base for other temporary directories.</description>
</property>

<property>
<name>fs.default.name</name>
<value>hdfs://localhost:54310</value>
<description>The name of the default file system. A URI whose
scheme and authority determine the FileSystem implementation. The
uri's scheme determines the config property (fs.SCHEME.impl) naming
the FileSystem implementation class. The uri's authority is used to
determine the host, port, etc. for a filesystem.</description>
</property>

III. mapred-site.xml

Open the file /usr/local/hadoop/conf/mapred-site.xml and append the following between <configuration>...</configuration>.
  • hduser@bimal:~$ sudo gedit /usr/local/hadoop/conf/mapred-site.xml
<property>
<name>mapred.job.tracker</name>
<value>localhost:54311</value>
<description>The host and port that the MapReduce job tracker runs at. If "local", then jobs are run in-process as a single map and reduce task.</description>
</property>

IV. hdfs-site.xml

Open up the file /usr/local/hadoop/conf/hdfs-site.xml
  • hduser@bimal:~$ sudo gedit /usr/local/hadoop/conf/hdfs-site.xml
Add the following lines of code between the <configuration>...</configuration>

<property>
<name>dfs.replication</name>
<value>1</value>
<description>Default block replication. The actual number of replications can be specified when the file is created. The default is used if replication is not specified in create time.</description>
</property>


Step 4: Formatting the HDFS Filesystem via the Namenode


Open a new terminal, switch to hduser and format the HDFS filesystem. On success the output should end with a line like "Storage directory /app/hadoop/tmp/dfs/name has been successfully formatted."
  • bimal@bimal:~$ su - hduser
  • hduser@bimal:~$ /usr/local/hadoop/bin/hadoop namenode -format
Step 5: Starting your Single-Node Cluster

  • hduser@bimal:~$ /usr/local/hadoop/bin/start-all.sh
After the command completes, run jps to check which of the daemons have started.
  • hduser@bimal:/usr/local/hadoop$ jps
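On a healthy Hadoop 1.x single-node cluster, jps should list five daemons (NameNode, DataNode, SecondaryNameNode, JobTracker, TaskTracker) plus Jps itself. A small sketch (not part of the original post) that flags anything missing:

```shell
# Compare the daemons start-all.sh is expected to launch against the
# current jps output and report any that are not running.
expected="NameNode DataNode SecondaryNameNode JobTracker TaskTracker"
jps_out="$(jps 2>/dev/null || true)"
for d in $expected; do
  echo "$jps_out" | grep -qw "$d" || echo "not running: $d"
done
```

If a daemon is missing, its log file under /usr/local/hadoop/logs is the first place to look.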


You can also check if Hadoop is listening on the configured ports. Open a new terminal and run
  • sudo netstat -plten | grep java


Step 6: Stopping your Single-Node Cluster

To stop all the daemons running on your machine, run the following command.
  • hduser@bimal:~$ /usr/local/hadoop/bin/stop-all.sh


