Social robots are designed to interact with humans in social environments, and their behavior is driven by emotional connection and social convention. When operating in such environments, a robot must consider human safety and comfort, and it should identify humans as social entities rather than as obstacles. This project aims to integrate a navigation approach for mobile robots in unknown environments in which the robot respects the personal space and boundaries of humans. It combines the perception, localization, and navigation of a mobile robot in social environments. Perception, i.e., the detection of humans, is performed with a machine learning technique called semantic segmentation. Localization and planning, which estimate the position and orientation of the robot, are handled by the slam_toolbox package in ROS 2. A multilevel mapping approach is applied to the map produced by slam_toolbox to remove noise and enable smooth operation of the robot. Costmaps are used for navigation. The planned hardware is a TurtleBot with a Kobuki base and a Raspberry Pi 4. The three modules are integrated to evaluate how the mobile robot navigates, with the expected result that navigation is achieved successfully and in a human-friendly manner in an unknown environment.
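To illustrate how detected humans can feed into the costmap-based navigation described above, the following is a minimal sketch of a "social cost" layer: each person detected by the perception module raises the cost of nearby grid cells so the planner keeps its distance. The function name, grid representation, and linear decay profile are illustrative assumptions, not the project's actual implementation.

```python
import numpy as np

def add_social_costs(costmap, humans, radius=5, peak=100):
    """Raise the cost around each detected human so a planner keeps distance.

    costmap : 2D int array of cell costs (0 = free, 100 = lethal)
    humans  : list of (row, col) cells where a person was detected
    radius  : number of cells over which the social cost decays to zero
    peak    : cost assigned at the person's own cell

    NOTE: illustrative sketch only; a real Nav2 costmap layer would be
    implemented as a plugin, not a standalone function like this.
    """
    out = costmap.astype(float).copy()
    rows, cols = np.indices(costmap.shape)
    for r, c in humans:
        dist = np.hypot(rows - r, cols - c)
        # Linear decay from `peak` at the person down to 0 at `radius`
        social = np.clip(peak * (1.0 - dist / radius), 0, peak)
        # Keep whichever cost is higher: existing obstacle or social cost
        out = np.maximum(out, social)
    return out.astype(int)

grid = np.zeros((9, 9), dtype=int)
social = add_social_costs(grid, humans=[(4, 4)], radius=4)
print(social[4, 4])  # → 100 (peak cost at the person's cell)
```

Because the social cost is merged with `np.maximum`, it never lowers the cost of cells already marked as obstacles; it only adds a comfort margin around people.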